%
% This is a very simple script whose aim is to illustrate how to use the MEG lab devices in a visual experiment involving stimulus presentation,
% response collection, eye-tracking and triggers (with photodiode correction). The non-MEG-related code is intended only as a suggestion, since the
% focus here is not the structure of the experiment itself, but how the commands related to the MEG devices integrate into it. The aim is that the
% reader can understand and re-use code snippets from here when programming an experiment. Matlab & Psychtoolbox knowledge is assumed. If that is not
% the case, we suggest following a good tutorial first.
%
% VERSIONING: 2018/11/27 Davide Tabarelli (davide.tabarelli@unitn.it)           First version
% VERSIONING: 2021/05/03 Gianpiero Monittola (gianpiero.monittola@unitn.it)     Last version
% VERSIONING: 2021/05/09 Davide Tabarelli & Gianpiero Monittola                 Last revision and wiki publication
%
% The experiment consists of a cue followed by a stimulus and has the following instructions:
%
% "Press the red button if the stimulus is a beep, press the blue button if the stimulus is a gabor."
% "Keep the gaze at fixation."
%
% The structure of the script is as follows:
%
% 1) A few "INIT" sections:
%    1a) General experiment-level parameter initialization
%    1b) Datapixx (blue boxes) initialization
%    1c) Psychtoolbox initialization
%    1d) Eyelink initialization
%    1e) Eyelink calibration
%
% 2) INSTRUCTIONS to the subject
%
% 3) A "TRIAL LOOP", cycling through trials and containing:
%    3a) An "INIT TRIAL" section, in which all trial-dependent parameters are initialized
%    3b) A section in which "EYELINK RECORDING" starts
%    3c) A "PRESENTATION LOOP", allowing fine control of the trial at each single screen refresh (frame)
%
% 4) A "CLOSINGS" section, in which devices are gracefully closed and data are locally saved
%

function sampleExperiment(sub)

% 'sub' is the subject number

%%%%%%%% INIT EXPERIMENT

% Here general initializations are performed, including the definition of all the parameters that will not change during the whole experiment.
% Examples are the trial timing, color codes etc ...

refrate = 120;  % In our standard setup the DataPixx and projector refresh rate are set to 120 Hz. This means that a "frame" (i.e. the smallest
                % possible time interval between changes in the projected image) is 8.33 ms. Furthermore, the projector uses DLP technology, which
                % means that it does not render the image like a usual monitor (drawing it from the top-left to the bottom-right corner at 120 Hz)
                % but, instead, waits for the whole video buffer to be received and then draws the whole image at once. This avoids the typical
                % "rastering" effect but implies that the image is "presented at once" at the end of the refresh interval, i.e. 8.33 ms after the
                % Psychtoolbox "Flip" command (when the refresh rate is set to 120 Hz). Moreover, we don't have to query Psychtoolbox about the real
                % refresh rate since the DataPixx technology ensures no jittering of the flip interval.
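% Since all trial timings below are given in seconds while the presentation loop is framed in flip intervals, it can help to keep the conversion in
% one place. A minimal sketch (the helper name 'frameOf' is our own, illustrative only and not used elsewhere in this script): a time t in seconds
% maps to frame ceil(t * refrate) + 1, the "+ 1" accounting for the fact that frame = 1 means time = 0.
frameOf = @(t) ceil(t * refrate) + 1;   % e.g. frameOf(0) is frame 1; at 120 Hz, frameOf(0.35) is the first frame at/after 350 ms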
numtrials = 10;             % Number of trials per condition
aud = 1*ones(1,numtrials);  % 1: condition auditory
vis = 2*ones(1,numtrials);  % 2: condition visual

% Generate a random sequence of conditions
trial.conds = [vis,aud];
trial.sequence = RandSample(trial.conds, [1 numtrials]);
for t=1:numtrials
    trial.cuesequence(t) = circshift(trial.sequence(t)', randi(3))';
end

% Define some color codes
color.bg = [0.5 0.5 0.5];   % Background color
color.txt = [1 1 1];        % Text color
color.black = [0 0 0];      % Black color
color.red = [1 0 0];        % Red color

% Trial definition (in seconds)
timing.precue = 0;                              % Trial start, fixation appears
timing.cue = 0.35;                              % Cue onset
timing.postcue = 0.5;                           % End of cue, back on fixation
timing.soa = 1.75;                              % Stimulus onset and start collecting responses
timing.resp = 1.9;                              % Back on fixation and continue collecting responses
timing.itij = 3 + 0.05 .* randn(1,numtrials);   % End of trial, including a jittered Inter Trial Interval

% It is better not to send too many Eyelink commands to the eye-tracker in a row. For this reason we wait for a short time, defined here, between them.
elk.wait = 0.01;

% It is always a good idea to save the experiment details (stimulation plan and responses) on disk. In the worst scenario, if you lose the information
% encoded in your triggers, with the help of the photodiode and the stimulation plan/results you can still analyze your data. Here we prepare a
% structure for this.
expdtl.numtrials = numtrials;
expdtl.color = color;
expdtl.timing = timing;
expdtl.conditions = zeros(numtrials, 1);
expdtl.button_presses = zeros(numtrials, 1);
expdtl.reaction_times = nan(numtrials, 1);
expdtl.sequence = trial.sequence;   % add gp
save expdtl expdtl                  % add gp

%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%% INIT DATAPIXX

% Controlling the Datapixx works as follows. The Blue Box keeps, on board, a "remote register", where both commands and information about the current
% status of the device are stored.
% The register is replicated on the stimulation computer as a "local register". Synchronization between the two registers is triggered by
% Datapixx('RegWr') and similar commands. There are two types of synchronization commands that differ in the time the synchronization is actually
% executed:
%
%   - Datapixx('RegWr') or Datapixx('RegWrRd'): registers are synchronized immediately. 'RegWr' only writes commands TO the remote register,
%     while 'RegWrRd' also gets information FROM the remote register
%
%   - Datapixx('RegWrVideoSync') or Datapixx('RegWrRdVideoSync'): registers are synchronized at the first available video sync from the graphics
%     hardware, which might (or might not) coincide with the next Screen('Flip') Psychtoolbox command in your code. Again, 'RegWrVideoSync' only
%     writes commands TO the remote register, while 'RegWrRdVideoSync' also gathers information FROM the remote register
%
% This is the theory. Now, practically, when a "Datapixx('...')" command is executed in the code, for example to fire a trigger, the corresponding
% instructions are stored only in the local register and nothing really happens at the physical devices until the registers are synchronized as
% above. At that time, the local register content is transferred to the remote register and the commands are really executed (for example, a trigger
% is fired). The same holds for information stored on the device (like button responses). New responses remain in the remote register until a
% synchronization, such as a "Datapixx('RegWrRd*')" read/write command, is executed: at that time, the local register contains the new responses, and
% you can use them in your code.

Datapixx('Open');               % Open DataPixx
Datapixx('SetVideoMode', 0);    % This sets the video mode to normal passthrough, no stereo mode. C24, straight passthrough from DVI 8-bit RGB to
                                % VGA RGB. In this configuration luminance is linear with RGB (see our wiki).
Datapixx('StopAllSchedules');   % Stop all schedules (audio waveforms, triggers etc...)
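% To make the local/remote register mechanism concrete, here is a minimal sketch of a fire-and-forget digital trigger (commented out so that it does
% not run during initialization; the value 4 is an arbitrary example). Note that nothing reaches the MEG acquisition until the 'RegWr' line runs:
%
%   Datapixx('SetDoutValues', 4);   % stage trigger value 4 in the LOCAL register only
%   Datapixx('RegWr');              % NOW the remote register is updated and the digital output actually changes
%   WaitSecs(0.005);                % keep the line high for a few ms so the acquisition samples it
%   Datapixx('SetDoutValues', 0);   % stage the reset ...
%   Datapixx('RegWr');              % ... and commit it: the line goes back to zero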
Datapixx('SetDoutValues', 0);   % Set digital output to zero, as required to prepare for triggering
Datapixx('EnableDinDebounce');  % Enable response debouncing. This is required to prune out spurious button presses after a real response
Datapixx('SetDinLog');          % Clear the digital input logger, i.e. clear old responses in the register
Datapixx('StopDinLog');         % Stop the running response logger
Datapixx('RegWrRd');            % So far, no real changes have occurred on the physical blue box devices. This command synchronizes the local and
                                % remote registers in read/write mode and immediately. Only now is the blue box status as determined by the above
                                % initializations.

responseButtonsMask = 2^0 + 2^1 + 2^2 + 2^3;    % Values of the response buttons are stored in a cumbersome binary way. This is a binary mask useful
                                                % to transform them into decimal human-readable values. In particular, red = 1, blue = 8, green = 4
                                                % and yellow = 2. It works. Just believe it.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%% INIT PSYCHTOOLBOX

% Standard Psychtoolbox commands: we assume everything is almost clear here.
PsychDefaultSetup(2);
screenNumber = 1;   % In the current MEG setup, the projector is screen number 1
[windowHandle, windowRect] = PsychImaging('OpenWindow', screenNumber, color.bg);
Screen('Flip',windowHandle);
width = windowRect(3);  % Get screen info
height = windowRect(4);
[xCenter, yCenter] = RectCenter(windowRect);
SetMouse(0,0);  % Move mouse away
HideCursor;     % Hide the mouse cursor

%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%% INIT EYELINK

% Here the Eyelink is initialized using the Psychtoolbox function "Eyelink('...')". Settings here are almost standard and more information about them
% can be found in the EyelinkToolbox documentation in your Psychtoolbox folder.

% This code initializes the connection with the Eyelink: if something fails, the program exits with an error
if EyelinkInit()~= 1
    error('Eyelink disconnected !!!');
end

% We need to provide the Eyelink with details about the graphics environment and perform some initializations.
% The initialization information is returned in a structure that also contains useful defaults and control codes (e.g. tracker state bits and Eyelink
% key values). The structure, moreover, acts as a handle for subsequent commands, like "windowHandle" for Psychtoolbox.
elk.el = EyelinkInitDefaults(windowHandle);

% Here we create the name for the Eyelink data file. Data gathered from the eye tracker are saved in a file on the eye-tracking PC. Data from all
% users are saved in the same folder and the folder is routinely cleaned up without any notice. So, be sure to copy your data after the experiment
% and choose a unique name for the data file (containing date/time, subject number etc...). It has to be less than 8 characters long.
elk.edfFile = sprintf('sx%d%s.edf', sub, datestr(now, 'HHMM'));     % Create file name
Eyelink('Openfile', elk.edfFile);                                   % Open the file on the eye-tracker

% Writing a short preamble to the file helps if the name turns out not to be that informative ;-)
Eyelink('command', sprintf('add_file_preamble_text ''A simple sample experiment for the CIMeC MEG: sub %d ; time %s''', sub, datestr(now, 'YYYYmmddhhMM')));

% Set the eye-tracker to record GAZE of LEFT and RIGHT eyes, together with pupil AREA
Eyelink('Command', 'link_sample_data = LEFT,RIGHT,GAZE,AREA');

% Set the proper recording resolution, calibration type, and data file content
Eyelink('command','screen_pixel_coords = %ld %ld %ld %ld', 0, 0, width - 1, height - 1);
Eyelink('message', 'DISPLAY_COORDS %ld %ld %ld %ld', 0, 0, width - 1, height - 1);

% Set the proper calibration type. Usually we use a 9 point calibration. For a long range mount, a 13 point calibration (HV13) is also a good
% (longer) option.
Eyelink('command', 'calibration_type = HV9');

% Set the proper data file content
Eyelink('command', 'file_event_filter = LEFT,RIGHT,FIXATION,SACCADE,BLINK,MESSAGE,BUTTON');
Eyelink('command', 'file_sample_data = LEFT,RIGHT,GAZE,HREF,AREA,GAZERES,STATUS');

% Set link data (used for gaze cursor, optional)
Eyelink('command', 'link_event_filter = LEFT,RIGHT,FIXATION,SACCADE,BLINK,MESSAGE,BUTTON');
Eyelink('command', 'link_sample_data = LEFT,RIGHT,GAZE,GAZERES,AREA,STATUS');

% Saccade detection thresholds (optional)
Eyelink('command', 'saccade_velocity_threshold = 35');
Eyelink('command', 'saccade_acceleration_threshold = 9500');

% Now make sure that we are still connected to the Eyelink ... otherwise throw an error
if Eyelink('IsConnected')~=1
    error('Eyelink disconnected !!!');
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%% EYELINK CALIBRATION

% This code allows the EyeLink software to take control of your Psychtoolbox screen. This means that at this point you will see the participant's
% eyes as recorded by the eye-tracker camera on the MEG whiteboard, a condition essential for setting up the camera. After setting up the camera you
% can perform calibration and validation at this step.

% Some calibration parameters
elk.el.foregroundcolour = 0;
elk.el.backgroundcolour = color.bg(1) * 255;

% Give the eye-tracker control of the screen for camera setup and calibration, until you exit back to Psychtoolbox by pressing ESC
EyelinkDoTrackerSetup(elk.el);

%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%% INSTRUCTIONS

Screen('FillRect', windowHandle, color.bg);
Screen('TextSize', windowHandle, 24);
Screen('TextFont', windowHandle, 'Arial');
Screen('TextColor', windowHandle , color.txt);
DrawFormattedText(windowHandle, 'Press the red button if the stimulus is a beep,\n\n the blue button if it is a gabor. \n \n Keep the gaze at fixation', 'center', 'center');
Screen('Flip', windowHandle);

%%%%%%%%%%%%%%%%%%%%%%%%%%%% prepare beep

freq = 44100;                           % Sample frequency
beep(1,:) = MakeBeep(1000, 0.25, freq);
nChannels = size(beep, 1);
channelList = [0:nChannels-1];
nTotalFrames = size(beep, 2);

% Stop any schedules which might already be running (the Datapixx is already open from the INIT DATAPIXX section)
Datapixx('StopAllSchedules');
Datapixx('RegWrRd');    % Synchronize DATAPixx registers to the local register cache

% Download the entire waveform to the DATAPixx default DAC address of 0.
Datapixx('WriteDacBuffer', beep, 0, channelList);
nFrames = max(size(beep));
lrMode = 3;

%%%%%%%%%%%%%%%%%%%%%%%%%%%% prepare gabor

ifi = Screen('GetFlipInterval', windowHandle);
fr = Screen('NominalFrameRate', windowHandle);

% Dimensions
gaborDimPix = 150;

% Sigma of the Gaussian
sigma = gaborDimPix / 6;

% Other obvious parameters
orientation = 90;
contrast = 0.5;
aspectRatio = 1.0;

% Spatial frequency (cycles per pixel)
% One cycle = Grey-Black-Grey-White-Grey, i.e. one black and one white lobe
numCycles = 3;
Fs = numCycles / gaborDimPix;

% Build a procedural gabor texture
gabortex = CreateProceduralGabor(windowHandle, gaborDimPix, gaborDimPix, [], [0.5 0.5 0.5 0.0], 1, 0.5);

% Number of gabors needed
nGabors = 1;

% Drift speed for the 2D global motion
degPerSec = 360 * 4;
degPerFrame = degPerSec * ifi;

% Uncomment the following part if random angles are needed.
% Randomise the Gabor orientations and determine the drift speed of each gabor. This is given by multiplying the global motion speed by the cosine of
% the difference between the gabor orientation and the global motion direction. Here the global motion direction is 0, so it is just the cosine of
% the angle we use.
% We re-orient the array when drawing.
%gaborAngles = rand(1, nGabors) .* 180 - 90;
gaborAngles = 45;   % one angle only, 45 degrees
degPerFrameGabors = cosd(gaborAngles) .* degPerFrame;

% Randomise the phase of the Gabor
phaseLine = rand(1, 1) .* 360;
propertiesMat = repmat([NaN, Fs, sigma, contrast, aspectRatio, 0, 0, 0], nGabors, 1);
propertiesMat(:, 1) = phaseLine';

% Number of frames to wait before re-drawing
% waitframes = 1;

%%%%%%%% TRIAL LOOP

for iTrial=1:length(trial.sequence)

    %%%%%%%% TRIAL INIT

    % It is not the case in this simple experiment but, if required, you can set here trial-dependent features that cannot be pre-set in the
    % experiment initialization (i.e. accuracy-dependent stimulus features etc ...)

    % We reset the whole screen to the background color at the beginning of each trial
    Screen('FillRect', windowHandle, color.bg);

    % In our setup we also provide an 'old-fashioned' triggering method, a.k.a. the photodiode. This consists of a photodiode that can be mounted
    % anywhere on the whiteboard and that is connected to analog input channel #8 of the MEG (the MISC channel #8). The photodiode is useful when you
    % want a very precise synchronization of your triggers with respect to the stimuli, or as a backup in case at some point the digital triggers
    % sent to the MEG digital input fail for any reason. In order to use the photodiode, you have to keep a small black square on the screen, which
    % is covered by the photodiode collector fiber and thus not visible to the participant. Every time you "flash" the square to white for a few
    % milliseconds you get an analog trigger in the MISC channel #8, which is synchronized with the actual image drawing. Here we prepare to draw a
    % black square for the photodiode at the top-left corner (from the participant's view).
    % Screen('FillRect', windowHandle, [0 0 0], [0 0 30 30]);

    % In this experiment we poll for subject responses at each frame, but only in a predefined interval considered valid for collecting responses.
    % This flag will be set to true (enabling response collection) only at that point.
    responsePoolEnabled = false;

    %%%%%%%% EYELINK RECORDING

    % Here we record Eyelink data at the trial level, resulting in an edf file with trials as defined in the experiment. For this reason we send a
    % trial id and some trial information to the Eyelink before starting the recording.
    Eyelink('Message', 'TRIALID %d', iTrial);
    Eyelink('command', 'record_status_message "TRIAL %d (see stimplan for sub = %d)"', iTrial, sub);

    % As specified before, it is better not to send too many Eyelink commands to the eye-tracker in a row. So, after the command, we wait for the
    % pre-set time
    WaitSecs(elk.wait);

    % Here we start recording Eyelink data (left/right gaze and pupil size), preceded by a short pause
    Eyelink('Command', 'set_idle_mode');
    WaitSecs(elk.wait);
    Eyelink('StartRecording', 1, 1, 1, 1);

    % A few samples of free Eyelink recording are left before we actually start displaying things on the screen. Otherwise you may lose a few ms of
    % data from your real trial.
    WaitSecs(elk.wait);

    %%%%%%%%%%%%%%%%%%%%%%%%%%%%

    %%%%%%%% PRESENTATION AUDIO

    if trial.sequence(iTrial)==1    % if audio

        Screen('FrameOval', windowHandle, color.red, CenterRectOnPoint([0 0 40 40], xCenter, yCenter), 3, 3);   % Outer fixation ring
        Screen('FrameOval', windowHandle, color.red, CenterRectOnPoint([0 0 20 20], xCenter, yCenter), 3, 3);   % Inner fixation ring
        Screen('FillRect', windowHandle, [1 1 1], [0 0 30 30]);     % White square for the photodiode

        Datapixx('SetDacSchedule', 0, freq, nTotalFrames*1, channelList, 0, nTotalFrames);
        Datapixx('StartDacSchedule');
        % Datapixx('RegWrRdVideoSync');     % start playing the sound at the next screen flip
        WaitSecs(0.5);

        %% trigger
        triggerPulse = [1 0] .* iTrial;
        Datapixx('StopDoutSchedule');
        Datapixx('WriteDoutBuffer', triggerPulse);
        Datapixx('SetDoutSchedule', 1.0/refrate, 100, 2);   % Delayed trigger (1/refresh rate delay with the ProPixx)
        Datapixx('StartDoutSchedule');
        Datapixx('RegWrVideoSync');

        % Display current frame
        Screen('Flip', windowHandle);

    else    % video

        %%%%%%%% PRESENTATION GABOR

        % This is the presentation loop, framed in flip intervals. Recall that here a 'frame' corresponds to a video refresh and represents the
        % smallest possible time interval between changes in the projected image (8.33 ms @ 120 Hz). The frame corresponding to a certain trial time
        % can be computed by ceiling the trial time of interest (in seconds) multiplied by the refresh rate (plus one, since frame = 1 means
        % time = 0). In the current design we cycle over all frames and, at each frame, we specify the status of the stimuli on the screen and of
        % the peripheral devices. This is not the only way to program a visual experiment, but it is functional to explain how the MEG devices work.

        % Cycling over frames: the total number of frames corresponds to the end of the trial.
        for frame = 1:ceil(timing.itij(iTrial) * refrate) + 1

            % At each frame we always reset the photodiode square to black. If needed, it will be turned to white by subsequent commands.
            Screen('FillRect', windowHandle, [0 0 0], [0 0 30 30]);

            % Here we keep only the fixation circle on screen, from "Trial start" to "Cue onset".
            if (frame > ceil(timing.precue * refrate)) && (frame <= ceil( timing.cue * refrate))
                % Draw fixation
                Screen('FrameOval', windowHandle, [0 0 0], CenterRectOnPoint([0 0 40 40], xCenter, yCenter), 3, 3);     % Outer fixation ring
                Screen('FrameOval', windowHandle, [0 0 0], CenterRectOnPoint([0 0 20 20], xCenter, yCenter), 3, 3);     % Inner fixation ring
            end

            % Here we keep the cue on screen, in the form of a colored fixation circle, from "Cue onset" to "End of cue".
            if (frame > ceil(timing.cue * refrate)) && (frame <= ceil( timing.postcue * refrate))
                % Draw cue as colored fixation
                Screen('FrameOval', windowHandle, color.txt, CenterRectOnPoint([0 0 40 40], xCenter, yCenter), 3, 3);   % Outer fixation ring
                Screen('FrameOval', windowHandle, color.txt, CenterRectOnPoint([0 0 20 20], xCenter, yCenter), 3, 3);   % Inner fixation ring
            end

            % We keep the fixation circle (not colored) on screen from "End of cue" to "Stimulus onset".
            if (frame > ceil(timing.postcue * refrate)) && (frame <= ceil( timing.soa * refrate))
                % Draw fixation
                Screen('FrameOval', windowHandle, [0 0 0], CenterRectOnPoint([0 0 40 40], xCenter, yCenter), 3, 3);     % Outer fixation ring
                Screen('FrameOval', windowHandle, [0 0 0], CenterRectOnPoint([0 0 20 20], xCenter, yCenter), 3, 3);     % Inner fixation ring
            end

            % Now we put the stimulus on screen and keep it there from "Stimulus onset" to "End of stimulus".
            if (frame > ceil(timing.soa * refrate)) && (frame <= ceil( timing.resp * refrate))
                % Stimulus (drifting gabor on a bright square)
                Screen('FillRect', windowHandle, color.txt, CenterRectOnPoint([0 0 150 150], xCenter, yCenter));
                Screen('DrawTextures', windowHandle, gabortex, [], [], gaborAngles - 90, [], [], [], [],...
                    kPsychDontDoRotation, propertiesMat');

                % Increment the phase of our gabor
                phaseLine = phaseLine + degPerFrameGabors;
                propertiesMat(:, 1) = phaseLine';

                % White square for the photodiode
                Screen('FillRect', windowHandle, [1 1 1], [0 0 30 30]);
            end

            % From "End of stimulus": draw fixation and wait for a response for the whole jittered ITI
            if frame > ceil( timing.resp * refrate) + 1
                % Draw fixation again and continue collecting responses
                fixcol = [0 0 0];
                Screen('FrameOval', windowHandle, fixcol, CenterRectOnPoint([0 0 40 40], xCenter, yCenter), 3, 3);  % Outer fixation ring
                Screen('FrameOval', windowHandle, fixcol, CenterRectOnPoint([0 0 20 20], xCenter, yCenter), 3, 3);  % Inner fixation ring
            end

            % TRIGGER: "Stimulus onset"
            if frame == ceil(timing.soa * refrate) + 1

                % The MEG can accept trigger values from 0 to 255 on the digital input STI101. Usually you want to fire a trigger when a relevant
                % event occurs and, additionally, encode some information in it (e.g. the condition, or some code computed from the cue/stimulus
                % combination). Here, for simplicity, we encode the trial number, mirroring the audio branch above. Whatever you encode, it is a
                % good idea to also save it in your locally saved variable and to send it to the eye-tracker: in this way the information is
                % redundantly stored, so as to stay on the safe side.
                triggerPulse = [1 0] .* iTrial;
                Datapixx('StopDoutSchedule');
                Datapixx('WriteDoutBuffer', triggerPulse);
                Datapixx('SetDoutSchedule', 1.0/refrate, 100, 2);   % Delayed trigger (1/refresh rate delay with the ProPixx)
                Datapixx('StartDoutSchedule');

                % Eyelink triggering
                Eyelink('Message', sprintf('Stimulus onset %d', iTrial));

                % Set a din marker for reaction times
                Datapixx('SetMarker');

                % Reset and fire up the response logger, and enable response polling from now on
                Datapixx('SetDinLog');
                Datapixx('StartDinLog');
                responsePoolEnabled = true;

                % White square for the photodiode
                Screen('FillRect', windowHandle, [1 1 1], [0 0 30 30]);
            end

            % Sync the DataPixx at the next VSync. This will execute all pending DataPixx instructions at the next available video flip from the
            % video card. In order to be synchronized with the Psychtoolbox onset this MUST BE as close as possible to the next Screen('Flip')
            % command !!!
            Datapixx('RegWrVideoSync');

            % Display current frame
            Screen('Flip', windowHandle);

            % Poll responses (if enabled) at every frame
            if responsePoolEnabled

                Datapixx('RegWrRd');                % Commit changes to/from the DataPixx
                status = Datapixx('GetDinStatus');  % Get the response logger status

                if status.newLogFrames > 0          % We've got new data in the response buffer !!!
                    [data, times] = Datapixx('ReadDinLog');     % Read the data in
                    pressedButton = bitand(data(end), responseButtonsMask);
                    secs = times(end);  % No need to record this if not saving to a behavioural *.mat file ... check.
                    % Eyelink triggering
                    Eyelink('Message', sprintf('Response %d', pressedButton));

                    % Send the RT/response trigger (response value + 128)
                    Datapixx('StopDoutSchedule');
                    triggerPulse = [1 0] .* (128 + pressedButton);
                    Datapixx('WriteDoutBuffer', triggerPulse);
                    Datapixx('SetDoutSchedule', 0, 100, 2);     % No need for a programmed delay here: we don't wait for the projector to fire the trigger
                    Datapixx('StartDoutSchedule');
                    Datapixx('RegWr');

                    % White square for the photodiode
                    Screen('FillRect', windowHandle, [1 1 1], [0 0 30 30]);
                end
            end
        end
    end

    % Stop the Eyelink recording at the end of each trial (for both the audio and the video branch)
    Eyelink('StopRecording');

    % Update behavioural results (maybe save ... decide whether to include this part in the example)

end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%%%%%%%% CLOSINGS

% Close the Eyelink data file and copy it from the eye-tracking PC. 'ReceiveFile' transfers the edf file into the current folder (here pwd; adapt the
% destination as needed).
Eyelink('Command', 'set_idle_mode');
WaitSecs(0.5);
Eyelink('CloseFile');
Eyelink('ReceiveFile', elk.edfFile, pwd, 1);
Eyelink('Shutdown');

% Gracefully close the Datapixx and the Psychtoolbox screen, and save the experiment details again
Datapixx('RegWrRd');
Datapixx('Close');
sca;
save expdtl expdtl
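% As an appendix, a minimal sketch of how the binary button values mentioned in the INIT DATAPIXX section can be decoded into human-readable names.
% This is illustrative only and commented out (the 'buttonName' helper is our own and is not used by the experiment above). Each button sets one bit
% of the Din value, so masking and taking log2 gives the bit index:
%
%   buttonNames = {'red', 'yellow', 'green', 'blue'};   % bit 0 = red (1), bit 1 = yellow (2), bit 2 = green (4), bit 3 = blue (8)
%   buttonName = @(v) buttonNames{log2(bitand(v, 2^0 + 2^1 + 2^2 + 2^3)) + 1};
%   % e.g. buttonName(8) is 'blue', buttonName(1) is 'red'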