Raw JSON
{'hasResults': False, 'derivedSection': {'miscInfoModule': {'versionHolder': '2025-12-24'}}, 'protocolSection': {'designModule': {'studyType': 'OBSERVATIONAL', 'designInfo': {'timePerspective': 'PROSPECTIVE', 'observationalModel': 'COHORT'}, 'enrollmentInfo': {'type': 'ESTIMATED', 'count': 10200}}, 'statusModule': {'overallStatus': 'RECRUITING', 'startDateStruct': {'date': '2019-03-26', 'type': 'ACTUAL'}, 'expandedAccessInfo': {'hasExpandedAccess': False}, 'statusVerifiedDate': '2025-10-08', 'completionDateStruct': {'date': '2027-12-31', 'type': 'ESTIMATED'}, 'lastUpdateSubmitDate': '2025-10-16', 'studyFirstSubmitDate': '2018-01-20', 'studyFirstSubmitQcDate': '2018-01-20', 'lastUpdatePostDateStruct': {'date': '2025-10-17', 'type': 'ESTIMATED'}, 'studyFirstPostDateStruct': {'date': '2018-01-23', 'type': 'ACTUAL'}, 'primaryCompletionDateStruct': {'date': '2027-12-31', 'type': 'ESTIMATED'}}, 'outcomesModule': {'primaryOutcomes': [{'measure': 'MRI', 'timeFrame': 'duration of the study', 'description': 'We will analyze the anatomical structures of the brain (structural MRI or DTI); amplitude of the BOLD signal (fMRI).'}, {'measure': 'MEG', 'timeFrame': 'duration of the study', 'description': 'We will quantify measures such as power spectrum, event- or task-related potentials, synchronization/de-synchronization and coherence between sensors or sources located close to the brain areas of interest.'}, {'measure': 'Behavioral Measures', 'timeFrame': 'duration of the study', 'description': 'We will quantify measures such as hit rate, reaction times, detection thresholds, eye movement patterns, color, shape and sound judgement. We may measure autonomic data during the course of the sub-study (such as heart rate, respiration, end-tidal CO2, skin conductance), which will be correlated to the outcome measures.'}]}, 'oversightModule': {'isFdaRegulatedDrug': False, 'isFdaRegulatedDevice': False}, 'conditionsModule': {'keywords': ['Magnetic Resonance Imaging (MRI)', 'MEG', 'Color Vision', 'Neuroimaging', 'Natural History'], 'conditions': ['Normal Physiology']}, 'referencesModule': {'seeAlsoLinks': [{'url': 'https://clinicalstudies.info.nih.gov/cgi/detail.cgi?A_2018-EI-0046.html', 'label': 'NIH Clinical Center Detailed Web Page'}]}, 'descriptionModule': {'briefSummary': 'Background:\n\nWhen people see and hear, the brain changes signals from the eyes and ears into perceptions and thoughts. No one fully understands how this happens. Researchers want to explore how healthy brains process sights and sounds.\n\nObjectives:\n\nTo explore how people understand what they see and hear when the brain processes sights and sounds.\n\nEligibility:\n\nParticipants aged 13-65 who have at least 20/40 vision in at least one eye and do not use a hearing aid.\n\nDesign:\n\nSome participants will take tests online anonymously. They will do computer tasks related to colors and behavior.\n\nIn-person participants will be screened with medical history and physical exam. They will complete questionnaires and vision and hearing tests.\n\nParticipants will plan how many testing sessions they will have and when. Sessions last 2-5 hours. They may include:\n\n* Magnetic Resonance Imaging: Magnets and radio waves to take pictures of the brain. Participants will lie on a table that slides in and out of a tube. They will do a task during the scan.\n* Magnetoencephalography: Records magnetic field changes from brain activity. Participants will sit or lie down. A cone will be lowered onto their head. 
They may do a task during the test.\n* Electromyography: Electrodes attached to the skin will measure the electrical activity of muscles.\n* Electroencephalogram: Electrodes on the scalp will record brain waves.\n* Electrocardiography: Electrodes on the chest will record heart electrical activity.\n* Tests of memory, attention, thinking, vision, and hearing.\n* Eye Tracking: Cameras will follow participants\' eye movements. They may wear a cap with infrared cameras in front of their eyes.\n\nDuring the sessions, participants\' vital signs may be monitored.', 'detailedDescription': 'Objective:\n\nThe experiments covered by this protocol aim to uncover basic knowledge about the brain mechanisms that give rise to perception and cognition. The protocol encompasses sub-studies in volunteers ("participants") to uncover behaviors and their physiological basis. The protocol includes only non- or minimally invasive techniques with minimal risk, including psychophysics, functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), genetic sequencing, and on-line tests using web-based platforms. The overarching goal of the research is to obtain new knowledge about the organization and operation of cortical circuits involved in visual and auditory perception. The protocol covers six specific inter-related aims, with specific sub-studies:\n\n1. Neurophysiological Mechanisms for Color\n2. Object-Color Associations (Color Memory)\n3. Connectivity of Functionally-Defined Regions\n4. Homologies between Monkeys and Humans\n5. Functional Organization of Sound Perception and Visual-Auditory Integration\n6. Multi-stability in Color Perception\n\nStudy Population:\n\nIn-lab or on-line volunteer participants aged 18-65, who are in good general health and have normal or corrected-to-normal vision, will be recruited from the local community and studied under this minimal risk protocol. We plan to recruit up to 100 in-person participants, 100 off-site participants, and up to 10,000 on-line volunteers. For off-site participants, the eligibility criteria will be ages 13-65.\n\nDesign:\n\nIn some sub-studies, the same subject will be asked to participate in tasks involving fMRI, MEG and psychophysics, and multiple sessions of each, so that we can control for individual differences in relating the outcomes of each experimental technique. In other sub-studies, participation in multiple tasks will not be required. The extent of the participant involvement, and what they will be asked to do, will be clearly disclosed during consent, as indicated in the consent documents. Brain activity will be monitored by fMRI and/or MEG. Anatomical MRI will be collected in some subjects to allow better localization of brain dynamics. Behavioral tests will be conducted using standard psychophysical approaches, involving presentation of visual and/or auditory stimuli while eye-movements are monitored using non-invasive eye tracking (such as with an infra-red camera directed at the participant\'s eyes). Subjects will be shown simple visual stimuli such as blobs, stripes, and spirals in assorted colors, moving dots, checkerboards, and every-day images and video clips, such as of fruit rolling on a tablecloth, faces, and scenery recorded in a car. Auditory stimuli will include every-day sounds such as birds singing, conversation, whispering, footsteps, car engines, and animal vocalizations, presented within normal sound limits (60-90 dB). We will not use provocative or sexually explicit images, clips or sounds. 
During fMRI and MEG sessions, subjects will be asked to free view, passively fixate, or engage in an attentional task during fixation, such as reporting, with an eye movement or button press, occasions when two images of the same category are presented sequentially.\n\nOutcome Measures:\n\nPerformance on behavioral tasks and brain activity (fMRI and MEG) will be combined to yield information about the neural correlates and processes underlying different aspects of the human neural visual processing stream including color perception, attention, visual discrimination and object/face/place recognition.\n\ni. MRI: to analyze measures such as the anatomical structures of the brain (using structural MRI); amplitude of the blood-oxygenation-level-dependent (BOLD) signal (using fMRI).\n\nii. MEG: to quantify measures such as power spectrum, event- or task-related potentials, synchronization/desynchronization, and coherence between sensors or sources located close to the brain areas of interest.\n\niii. Behavioral measures: to quantify measures such as hit rate, reaction times, thresholds, similarity judgments, associations, naming (such as names for color stimuli and sounds) and eye movements.\n\nWe may measure autonomic data during the course of the experiment (such as heart rate, respiration, end-tidal CO2, skin conductance), which will be correlated with the outcome measures.'}, 'eligibilityModule': {'sex': 'ALL', 'stdAges': ['CHILD', 'ADULT', 'OLDER_ADULT'], 'maximumAge': '65 Years', 'minimumAge': '13 Years', 'samplingMethod': 'NON_PROBABILITY_SAMPLE', 'studyPopulation': 'For in-lab testing, two hundred subjects (200) will be recruited from a healthy adult population and are expected to represent a broad cross-section of the population. For on-line testing, we anticipate up to 10,000 subjects will be recruited.', 'healthyVolunteers': True, 'eligibilityCriteria': "* INCLUSION CRITERIA:\n\nInclusion Criteria for In-lab Participants\n\nA subject can be included in the in-lab portion of the study if he/she:\n\n* is in good general health;\n* is between 18 and 65 years old;\n* has visual acuity of 20/40 in at least one eye (corrected with contact lenses is okay);\n* has no hearing impairment requiring a hearing aid;\n* is capable of understanding the procedures and requirements of this study;\n* is willing and able to provide his/her own informed consent.\n\nInclusion Criteria for On-line Participants\n\nA subject can be included in the on-line portion of the study if he/she:\n\n* is in good general health;\n* is between 18 and 65 years old;\n* is capable of understanding the procedures and requirements of this study;\n* is willing and able to provide his/her own informed consent.\n\nInclusion Criteria for Genetic Screening/Off-site Participants\n\nA subject can be included in the off-site portion of the study if he/she:\n\n* is between 13 and 65 years old;\n* is in general good health;\n* is capable of understanding the procedures and requirements of this study;\n* is willing and able to provide his/her own informed consent.\n\nEXCLUSION CRITERIA:\n\nExclusion Criteria for In-lab Participants\n\nA participant is not eligible for participation in the in-lab portion of this study if any of the following exclusion criteria are present, as self-reported by the prospective participant or determined during clinical testing following consent:\n\n* Participant is pregnant\n* Participant has serious vision or hearing problems; for some sub-studies focused on mechanisms of normal color vision, subjects who 
are colorblind will also be excluded;\n* Participants without consent capacity will not be enrolled;\n* Participant has a debilitating neurological disorder (examples include, but are not limited to: brain tumor, epilepsy, Alzheimer's Disease, Parkinson's Disease, multiple sclerosis) or a psychiatric disorder (examples include, but are not limited to: schizophrenia, clinical anxiety, severe depression, attention deficit hyperactivity disorder (ADHD)). Patients with non-debilitating conditions such as prosopagnosia, prosopometamorphopsia (PMO), or aphantasia will have their conditions noted but will not be excluded;\n* Participant has had a serious head injury or has a history of brain surgery. Head injury is defined as an injury to the brain from some external force resulting in loss of consciousness of 30 minutes or more;\n* Participant has psychoactive drug or alcohol abuse or dependence in the past three months, as determined by the Drug Abuse Screening Test (DAST), except nicotine and caffeine. A score of 6 or greater on the DAST will be considered exclusionary. The effects of nicotine and caffeine in neuroimaging are attenuated if participants do not smoke or consume caffeine 2-3 hours before the scan session; Over-the-counter medication/herbals will not be a criterion for exclusion;\n* Participant is an NEI employee within the Perception, Cognition and Action section.\n\nAdditional Exclusion Criteria for MRI Sub-studies\n\nContraindications to MR scanning include the following: metallic tattoos or metallic eyeliner; claustrophobia; inability to lie still on their back for approximately 2 hours; implanted cardiac pacemaker or auto-defibrillator; surgical aneurysm clips; implanted neural stimulator; artificial heart valves or pumps; metal fragments in cranial cavity, body or eyes (e.g., history as a metal worker); nitroglycerin patch (foil backer); cochlear implants (tubes are okay); weight \\> 450 lbs; metal rods, plates, screws in body; shrapnel or bullet wound; intrauterine device (IUD) not approved on mrisafety.com (most IUDs are okay); vestibular or inner ear abnormality such as Meniere's disease; metallic braces; hair extensions attached with metallic wires; transdermal patches; movement disorders; dental implants; consumption of nicotine or caffeine in the two hours prior to the experimental session. Subjects may participate in this study but will not be allowed to have a 7.0 T MRI scan if they have metallic dental crowns or a bridge.\n\nExclusion Criteria for Genetic Screening/Off-site Participants\n\nOff-site participation will be biased to participants with XY chromosomes until the genetic sequencing capability changes. We do not limit enrollment to just XY patients, both to preserve the possibility of evaluating newer genetic sequencing capacity and because sequencing on XX can provide some information, even if it is not as decisive as on XY patients.\n\nExclusion Criteria for On-line Participants\n\nSubjects may not participate in the on-line portion of the study if they:\n\n* Do not have access to compatible equipment. For example, smartphone screens are too small to be used. 
The online platform will outline which devices may be used.\n* Are unwilling to allow JavaScript to run on the site and to disable any script blockers.\n* Are unwilling to agree to the on-line testing platform's terms and conditions."}, 'identificationModule': {'nctId': 'NCT03407066', 'briefTitle': 'Perception, Sensation, Cognition and Action in Humans', 'organization': {'class': 'NIH', 'fullName': 'National Institutes of Health Clinical Center (CC)'}, 'officialTitle': 'Perception, Sensation, Cognition and Action in Humans', 'orgStudyIdInfo': {'id': '180046'}, 'secondaryIdInfos': [{'id': '18-EI-0046'}]}, 'armsInterventionsModule': {'armGroups': [{'label': 'In-person', 'description': '200 in-person healthy volunteers'}, {'label': 'On-line', 'description': '10,000 on-line healthy volunteers'}]}, 'contactsLocationsModule': {'locations': [{'zip': '20892', 'city': 'Bethesda', 'state': 'Maryland', 'status': 'RECRUITING', 'country': 'United States', 'facility': 'National Eye Institute (NEI)', 'geoPoint': {'lat': 38.98067, 'lon': -77.10026}}, {'zip': '20892', 'city': 'Bethesda', 'state': 'Maryland', 'status': 'RECRUITING', 'country': 'United States', 'contacts': [{'name': 'For more information at the NIH Clinical Center contact Office of Patient Recruitment (OPR)', 'role': 'CONTACT', 'email': 'prpl@cc.nih.gov', 'phone': '800-411-1222', 'phoneExt': 'TTY8664111010'}], 'facility': 'National Institutes of Health Clinical Center', 'geoPoint': {'lat': 38.98067, 'lon': -77.10026}}], 'centralContacts': [{'name': 'Marianne F Duyck, Ph.D.', 'role': 'CONTACT', 'email': 'marianne.duyck@nih.gov', 'phone': '(301) 402-4956'}, {'name': 'Bevil R Conway, Ph.D.', 'role': 'CONTACT', 'email': 'bevil.conway@nih.gov', 'phone': '(301) 594-3238'}], 'overallOfficials': [{'name': 'Bevil R Conway, Ph.D.', 'role': 'PRINCIPAL_INVESTIGATOR', 'affiliation': 'National Eye Institute (NEI)'}]}, 'ipdSharingStatementModule': {'ipdSharing': 'NO'}, 'sponsorCollaboratorsModule': {'leadSponsor': {'name': 'National Eye Institute (NEI)', 'class': 'NIH'}, 'responsibleParty': {'type': 'SPONSOR'}}}}
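
The record above is the ClinicalTrials.gov registration for NCT03407066, printed as a Python dict repr (single quotes, True/False) rather than strict JSON, so json.loads would reject it as-is. The sketch below is a minimal, non-authoritative example of working with it under stated assumptions: the blob is assumed to have been saved verbatim (including the line wrapping) to a hypothetical file named nct03407066_raw.txt; the script rejoins the wrapped lines, parses the dict with ast.literal_eval, and pulls out a few of the fields shown above (identification, status, design, primary outcomes, eligibility ages, and central contacts).

import ast
from pathlib import Path

# Assumption: the raw blob above was saved verbatim to this hypothetical file.
RAW_PATH = Path("nct03407066_raw.txt")

# The blob was wrapped across several lines when pasted; rejoin it first.
# It is a Python dict literal (single quotes, True/False), not strict JSON,
# so ast.literal_eval is used here instead of json.loads.
raw = RAW_PATH.read_text().replace("\n", "")
study = ast.literal_eval(raw)

protocol = study["protocolSection"]
ident = protocol["identificationModule"]
status = protocol["statusModule"]
design = protocol["designModule"]
outcomes = protocol["outcomesModule"]
eligibility = protocol["eligibilityModule"]
contacts = protocol["contactsLocationsModule"]

# Headline facts: NCT ID, title, recruitment status, design, enrollment estimate.
print(ident["nctId"], "-", ident["briefTitle"])
print("Status:", status["overallStatus"], "| verified:", status["statusVerifiedDate"])
print("Design:", design["studyType"], "/", design["designInfo"]["observationalModel"],
      "| estimated enrollment:", design["enrollmentInfo"]["count"])

# Primary outcome measures with their time frames.
for outcome in outcomes["primaryOutcomes"]:
    print(f"Primary outcome: {outcome['measure']} ({outcome['timeFrame']})")

# Eligibility window and healthy-volunteer flag.
print("Ages:", eligibility["minimumAge"], "to", eligibility["maximumAge"],
      "| healthy volunteers:", eligibility["healthyVolunteers"])

# Central study contacts.
for contact in contacts["centralContacts"]:
    print(f"Contact: {contact['name']} <{contact['email']}> {contact['phone']}")

If strict JSON is needed downstream, json.dumps(study) after the literal_eval step yields a valid JSON string. The same field paths should also apply to the live record served by the ClinicalTrials.gov v2 API (https://clinicaltrials.gov/api/v2/studies/NCT03407066), though that endpoint is an assumption here and not part of the record above.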