Viewing Study NCT07262632


Ignite Creation Date: 2025-12-24 @ 9:24 PM
Ignite Modification Date: 2025-12-25 @ 7:10 PM
Study NCT ID: NCT07262632
Status: NOT_YET_RECRUITING
Last Update Posted: 2025-12-03
First Submitted: 2025-11-20
Is Gene Therapy: False
Has Adverse Events: False
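The header fields above mirror a single-study record from the ClinicalTrials.gov REST API. A minimal sketch of building and validating the request URL for this study; the v2 endpoint path is an assumption based on the public API and should be verified against the current documentation:

```python
import re

# ClinicalTrials.gov v2 single-study endpoint (assumed; verify against the
# current API docs before relying on it).
API_BASE = "https://clinicaltrials.gov/api/v2/studies"

def is_valid_nct_id(nct_id: str) -> bool:
    """NCT IDs are the literal prefix 'NCT' followed by exactly 8 digits."""
    return re.fullmatch(r"NCT\d{8}", nct_id) is not None

def study_url(nct_id: str) -> str:
    """Build the record URL for one study, e.g. the study shown above."""
    if not is_valid_nct_id(nct_id):
        raise ValueError(f"not a valid NCT ID: {nct_id!r}")
    return f"{API_BASE}/{nct_id}"

print(study_url("NCT07262632"))
# -> https://clinicaltrials.gov/api/v2/studies/NCT07262632
```

Fetching that URL returns the same nested structure (`protocolSection`, `derivedSection`, …) shown in the raw record below.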

Brief Title: Developing a Multimodal Cancer Pain Database to Support AI-Based Automatic Pain Assessment
Sponsor: Dr. Mark Mulder (Sponsor-Investigator)
Organization: Erasmus Medical Center

Raw Record (Python dict representation of the ClinicalTrials.gov JSON)

{'hasResults': False, 'derivedSection': {'miscInfoModule': {'versionHolder': '2025-12-24'}, 'conditionBrowseModule': {'meshes': [{'id': 'D000072716', 'term': 'Cancer Pain'}, {'id': 'D009369', 'term': 'Neoplasms'}, {'id': 'D005149', 'term': 'Facial Expression'}, {'id': 'D014828', 'term': 'Vocalization, Animal'}], 'ancestors': [{'id': 'D010146', 'term': 'Pain'}, {'id': 'D009461', 'term': 'Neurologic Manifestations'}, {'id': 'D012816', 'term': 'Signs and Symptoms'}, {'id': 'D013568', 'term': 'Pathological Conditions, Signs and Symptoms'}, {'id': 'D009633', 'term': 'Nonverbal Communication'}, {'id': 'D003142', 'term': 'Communication'}, {'id': 'D001519', 'term': 'Behavior'}, {'id': 'D000819', 'term': 'Animal Communication'}, {'id': 'D001522', 'term': 'Behavior, Animal'}]}}, 'protocolSection': {'designModule': {'studyType': 'OBSERVATIONAL', 'designInfo': {'timePerspective': 'PROSPECTIVE', 'observationalModel': 'CASE_CONTROL'}, 'enrollmentInfo': {'type': 'ESTIMATED', 'count': 200}, 'targetDuration': '3 Weeks', 'patientRegistry': True}, 'statusModule': {'overallStatus': 'NOT_YET_RECRUITING', 'startDateStruct': {'date': '2025-12-01', 'type': 'ESTIMATED'}, 'expandedAccessInfo': {'hasExpandedAccess': False}, 'statusVerifiedDate': '2025-12', 'completionDateStruct': {'date': '2026-12-31', 'type': 'ESTIMATED'}, 'lastUpdateSubmitDate': '2025-12-01', 'studyFirstSubmitDate': '2025-11-20', 'studyFirstSubmitQcDate': '2025-12-01', 'lastUpdatePostDateStruct': {'date': '2025-12-03', 'type': 'ESTIMATED'}, 'studyFirstPostDateStruct': {'date': '2025-12-03', 'type': 'ESTIMATED'}, 'primaryCompletionDateStruct': {'date': '2026-12-31', 'type': 'ESTIMATED'}}, 'outcomesModule': {'primaryOutcomes': [{'measure': 'Audiovisual recordings of facial and vocal pain expression', 'timeFrame': 'Up to 2 weeks', 'description': "The main endpoint consists of audiovisual recordings of each patient's face (from the shoulders up) and voice while the patient reads a short standardised sentence and describes 
their pain experience. Recordings (maximum 60 seconds after filtering for relevant parts) will be stored digitally and used as input data for the training and validation of an AI-based pain assessment algorithm."}], 'secondaryOutcomes': [{'measure': 'Clinical and patient-reported parameters from electronic health records', 'timeFrame': 'At each recording session', 'description': 'Secondary data used to label the audiovisual recordings and provide context for model development, including patient-reported pain intensity (NRS 0-10), patient-reported change in pain (increasing, decreasing, no change), patient description of pain (short annotations), quality of life (if reported), observer-reported pain intensity (NRS 0-10), analgesics in use (drug name, dose, frequency), cancer tumour type, stage, Karnofsky score, and current treatment, demographics: age, sex, height, weight'}, {'measure': 'Patient-reported mood and pain questionnaire outcomes', 'timeFrame': 'During each recording session', 'description': "Assessment of patients' self-reported mood and perceived pain expression using a short questionnaire administered alongside audiovisual recordings. The questionnaire evaluates: current mood state, perceived intensity and expressiveness of pain, ease or difficulty in expressing pain verbally and non-verbally. 
Responses will be used to explore associations between subjective reports and multimodal AI-derived pain indicators."}]}, 'oversightModule': {'oversightHasDmc': False, 'isFdaRegulatedDrug': False, 'isFdaRegulatedDevice': False}, 'conditionsModule': {'keywords': ['Cancer-related Pain', 'Pain assessment', 'Artificial Intelligence (AI)', 'Oncology', 'Oncology Pain', 'Database', 'Facial Action Unit', 'Vocalizations', 'Acoustic', 'Automatic assessment', 'Machine learning', 'Audiovisual data', 'Paralinguistics'], 'conditions': ['Cancer-related Pain', 'Pain Assessment', 'Artificial Intelligence (AI)', 'Oncology', 'Oncology Pain', 'Database', 'Facial Expression', 'Voice', 'Cancer']}, 'descriptionModule': {'briefSummary': 'The goal of this observational study is to collect short video and sound recordings of people with cancer to create a secure database that can be used in future research to develop an artificial intelligence (AI) tool for pain assessment. The main aim is to build a large, high-quality collection of audiovisual data showing how people with cancer express themselves when they do and do not have pain.\n\nParticipants will include adults with cancer who are admitted to the oncology ward for pain treatment and a control group admitted for chemotherapy who have no pain. After giving consent, participants will:\n\n* Be recorded on video (from the shoulders up) for up to 60 seconds while reading a short sentence and describing their pain or daily experience.\n* Complete a short questionnaire about their mood and pain expression.\n* Allow researchers to collect some information from their medical record, such as their pain score, medications, and cancer type.\n\nThese recordings will be securely stored and used to create a database for future AI research. No medical tests, new treatments, or extra hospital visits are involved. 
This study will provide the foundation for developing future AI-based tools that could support doctors and patients in monitoring and managing pain more accurately and easily.', 'detailedDescription': "This is a prospective, single-centre database development study conducted at the Erasmus MC Cancer Institute. The purpose of the study is to create a high-quality audiovisual database for future research on automated pain assessment in people with cancer. The database will include short (up to 60 seconds) video and sound recordings of participants' facial and vocal expressions, alongside relevant clinical and demographic information.\n\nThe study will include two groups: (1) adults admitted to the oncology ward for cancer-related pain (pain group) and (2) adults admitted for chemotherapy who report no pain (control group). Participants will provide written informed consent before any recording takes place.\n\nEach participant will be recorded from the shoulders up while reading a neutral prompt and, if applicable, describing their pain experience. The recordings will be securely stored in compliance with the General Data Protection Regulation (GDPR). In addition, researchers will collect secondary parameters from the participants' electronic medical record, such as pain scores, analgesic use, tumour type, and clinical status. Participants in the pain group may be recorded on multiple days during admission, while controls will be recorded once.\n\nThe audiovisual recordings will be used to extract facial and vocal features (e.g., Facial Action Units, voice parameters) with open-source software such as OpenFace and OpenSmile. These features will form the foundation for future artificial intelligence (AI) model development aimed at automatic pain assessment.\n\nNo experimental interventions, additional clinical procedures, or diagnostic tests are part of this study. 
The study carries minimal risk to participants, primarily related to the handling of identifiable audiovisual data. All data will be stored on Erasmus MC's secure Research Storage \\& Compute infrastructure, accessible only to authorised researchers.\n\nThe resulting dataset will be used to develop and validate AI models that can objectively estimate pain intensity from audiovisual data, supporting more accurate and continuous pain monitoring in clinical care."}, 'eligibilityModule': {'sex': 'ALL', 'stdAges': ['ADULT', 'OLDER_ADULT'], 'minimumAge': '18 Years', 'samplingMethod': 'PROBABILITY_SAMPLE', 'studyPopulation': 'Hospitalised adult cancer patients recruited from the oncology and palliative care departments of Erasmus University Medical Center (Erasmus MC) in Rotterdam, the Netherlands. The study includes both patients admitted for cancer-related pain and those admitted for treatment without pain. Participants represent an inpatient clinical population from a tertiary referral centre, providing a controlled setting for audiovisual and clinical data collection.', 'healthyVolunteers': False, 'eligibilityCriteria': 'Inclusion Criteria:\n\n* Adult patients (≥18 years)\n* Diagnosed with cancer, active\n* Able to communicate verbally in Dutch or English\n* Able to provide written informed consent.\n* Pain group specific:\n* Experiencing pain related to cancer\n* Admitted to the hospital due to pain\n* Control group specific:\n* Not experiencing pain (NRS = 0)\n* Admitted to the hospital for chemotherapy\n\nExclusion Criteria:\n\n* Cognitive, physical, or medical limitations that prevent participation in the audiovisual recording sessions or affect facial expressions or voice (e.g. 
facial paralysis, tracheostomy, severe speech impairment).\n* Critical illness or end-of-life care where participation would impose an additional burden.\n* Experiencing pain not associated with cancer\n* Infectious isolation precautions that prevent safe data collection'}, 'identificationModule': {'nctId': 'NCT07262632', 'acronym': 'SENSAI-DBD', 'briefTitle': 'Developing a Multimodal Cancer Pain Database to Support AI-Based Automatic Pain Assessment', 'organization': {'class': 'OTHER', 'fullName': 'Erasmus Medical Center'}, 'officialTitle': 'SENSAI: Seeing, hEaring, seNsing: Smart, Effortless and Objective Pain Assessment With Mobile AI Technology - DataBase Development', 'orgStudyIdInfo': {'id': '11388'}, 'secondaryIdInfos': [{'id': 'MEC-2024-0601', 'type': 'OTHER', 'domain': 'Medical Ethics Review Committee Erasmus University Medical Centre'}]}, 'armsInterventionsModule': {'armGroups': [{'label': 'Pain Group', 'description': 'Adults (18 years and older) with active cancer who are admitted to the oncology ward for cancer-related pain. Participants will be recorded on video (from the shoulders up) for up to 60 seconds while reading a short sentence and describing their pain experience. They will also complete a short questionnaire about their mood and pain expression. Audiovisual data and selected clinical information (such as pain scores and medications) will be collected from their medical record. Participants may be recorded on several consecutive days during admission.', 'interventionNames': ['Other: Audiovisual and questionnaire-based data collection (consecutive)']}, {'label': 'Control Group', 'description': 'Adults (18 years and older) with active cancer who are admitted to the oncology ward for chemotherapy and report no pain (Numerical Rating Scale score = 0). Participants will be recorded on video (from the shoulders up) for up to 60 seconds while reading a short sentence and describing their daily experience. 
They will also complete a short questionnaire about their mood and pain expression. Only one recording will be made for each participant.', 'interventionNames': ['Other: Audiovisual and questionnaire-based data collection (once)']}], 'interventions': [{'name': 'Audiovisual and questionnaire-based data collection (once)', 'type': 'OTHER', 'otherNames': ['Video recording procedure', 'Mood questionnaire', 'Pain questionnaire'], 'description': 'Participants will be video recorded for up to 60 seconds from the shoulders up while reading a short sentence and describing their pain or daily experience. They will also complete a brief questionnaire about their mood and how they express pain. No drugs, medical devices, or clinical treatments are used. This non-invasive data collection is performed once for participants without pain.', 'armGroupLabels': ['Control Group']}, {'name': 'Audiovisual and questionnaire-based data collection (consecutive)', 'type': 'OTHER', 'otherNames': ['Video recording procedure', 'Mood questionnaire', 'Pain questionnaire'], 'description': 'Participants will be video recorded for up to 60 seconds from the shoulders up while reading a short sentence and describing their pain or daily experience. They will also complete a brief questionnaire about their mood and how they express pain. No drugs, medical devices, or clinical treatments are used. 
This non-invasive data collection is performed on several consecutive days during admission for participants with pain.', 'armGroupLabels': ['Pain Group']}]}, 'contactsLocationsModule': {'locations': [{'zip': '3015 GD', 'city': 'Rotterdam', 'state': 'South Holland', 'country': 'Netherlands', 'contacts': [{'name': 'Marsha Kamsteeg, MSc', 'role': 'CONTACT', 'email': 'm.kamsteeg@erasmusmc.nl', 'phone': '+31 107040704'}, {'role': 'CONTACT', 'email': 'm.kamsteeg@erasmusmc.nl'}, {'name': 'Marsha Kamsteeg, MSc.', 'role': 'SUB_INVESTIGATOR'}, {'name': 'Mark Mulder, dr.ir.', 'role': 'PRINCIPAL_INVESTIGATOR'}], 'facility': 'Erasmus University Medical Centre', 'geoPoint': {'lat': 51.9225, 'lon': 4.47917}}], 'centralContacts': [{'name': 'Marsha Kamsteeg, MSc', 'role': 'CONTACT', 'email': 'm.kamsteeg@erasmusmc.nl', 'phone': '+31 107040704'}], 'overallOfficials': [{'name': 'Mark Mulder, dr.ir.', 'role': 'PRINCIPAL_INVESTIGATOR', 'affiliation': 'Erasmus University Medical Centre'}]}, 'ipdSharingStatementModule': {'infoTypes': ['STUDY_PROTOCOL', 'SAP', 'ICF'], 'timeFrame': 'The de-identified dataset containing extracted facial and vocal features, along with relevant clinical and demographic variables, will be made available after publication of the main study results and within 12 months of study completion. Data will remain accessible for at least 5 years after the initial release, or longer if there is continued scientific interest and compliance with privacy regulations.', 'ipdSharing': 'YES', 'description': 'Individual participant data (IPD) in the form of non-identifiable, derived features will be shared for research purposes. The shared data will include automatically extracted facial and vocal features (e.g., facial action units, voice parameters) obtained from audiovisual recordings of participants, along with relevant clinical and demographic variables. 
The original audiovisual recordings will not be shared, as they contain identifiable information (faces and voices).\n\nAccess to the derived dataset will be provided to qualified researchers upon reasonable request and after approval of a data sharing agreement that ensures compliance with privacy regulations, including the General Data Protection Regulation (GDPR). Data will be shared only for scientific and non-commercial research.', 'accessCriteria': 'Access to de-identified participant data and supporting materials will be granted to qualified academic researchers upon reasonable request. Interested researchers must submit a written proposal outlining their research objectives and analysis plan. Access will be provided only after approval of a data sharing agreement that ensures ethical use of the data and compliance with General Data Protection Regulation (GDPR) requirements. The original audiovisual recordings containing identifiable information will not be shared.\n\nData will be made available through controlled access managed by the research team at Erasmus MC.'}, 'sponsorCollaboratorsModule': {'leadSponsor': {'name': 'Dr. Mark Mulder', 'class': 'OTHER'}, 'collaborators': [{'name': 'Delft University of Technology', 'class': 'OTHER'}], 'responsibleParty': {'type': 'SPONSOR_INVESTIGATOR', 'investigatorTitle': 'Principle Investigator', 'investigatorFullName': 'Dr. Mark Mulder', 'investigatorAffiliation': 'Erasmus Medical Center'}}}}
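Note that the blob above is a Python dict repr (single quotes, `True`/`False`), not strict JSON, so `json.loads` will reject it; `ast.literal_eval` parses it safely without executing code. A minimal sketch on a trimmed excerpt of the record above (field names and values taken from the record; assumes the blob is read back as one unbroken string):

```python
import ast

# Trimmed excerpt of the raw record above, kept in Python-literal form.
raw = ("{'hasResults': False, 'protocolSection': {"
       "'identificationModule': {'nctId': 'NCT07262632', 'acronym': 'SENSAI-DBD'}, "
       "'statusModule': {'overallStatus': 'NOT_YET_RECRUITING'}, "
       "'designModule': {'studyType': 'OBSERVATIONAL', "
       "'enrollmentInfo': {'type': 'ESTIMATED', 'count': 200}}}}")

# literal_eval only accepts literals (dicts, lists, strings, numbers,
# booleans, None), so untrusted input cannot run arbitrary code.
record = ast.literal_eval(raw)

proto = record['protocolSection']
print(proto['identificationModule']['nctId'])            # NCT07262632
print(proto['statusModule']['overallStatus'])            # NOT_YET_RECRUITING
print(proto['designModule']['enrollmentInfo']['count'])  # 200
```

To share the record as standard JSON afterwards, `json.dumps(record)` on the parsed dict produces a strictly valid serialization (double quotes, `true`/`false`).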