Search Articles

Search Results (1 to 10 of 11 Results)

Acoustic and Natural Language Markers for Bipolar Disorder: A Pilot, mHealth Cross-Sectional Study

Multimodal, smartphone-integrated digital assessments could serve as powerful clinical tools that remotely complement standard mental health evaluations, potentially helping to distinguish clinical conditions in people with BD. The feasibility of such systems seems promising, though issues related to privacy, intrusiveness, and the clinical therapeutic relationship should be carefully considered.

Cristina Crocamo, Riccardo Matteo Cioni, Aurelia Canestro, Christian Nasti, Dario Palpella, Susanna Piacenti, Alessandra Bartoccetti, Martina Re, Valentina Simonetti, Chiara Barattieri di San Pietro, Maria Bulgheroni, Francesco Bartoli, Giuseppe Carrà

JMIR Form Res 2025;9:e65555

Empowering Mental Health Monitoring Using a Macro-Micro Personalization Framework for Multimodal-Multitask Learning: Descriptive Study

Participants provided self-annotated emotional states over 2 weeks, creating a rich, multimodal resource for understanding daily mental health dynamics. We developed a macro-micro framework for personalized daily mental health monitoring: the framework uses a multimodal, multitask learning (MTL) strategy that innovatively combines global emotion embeddings with individual personalization embeddings.

Meishu Song, Zijiang Yang, Andreas Triantafyllopoulos, Zixing Zhang, Zhe Nan, Muxuan Tang, Hiroki Takeuchi, Toru Nakamura, Akifumi Kishi, Tetsuro Ishizawa, Kazuhiro Yoshiuchi, Björn Schuller, Yoshiharu Yamamoto

JMIR Ment Health 2024;11:e59512
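
The framework described in the entry above can be illustrated with a minimal sketch. This is an assumed architecture, not the authors' code: a shared "macro" encoder produces a global emotion embedding from fused multimodal features, a learned per-participant "micro" embedding is concatenated to it, and separate output heads serve the multiple self-report targets (multitask learning). All dimensions and names below are hypothetical.

```python
# Minimal sketch of a macro-micro personalization model (assumed design, not
# the authors' code): a shared encoder yields a global "macro" emotion
# embedding, a per-participant "micro" embedding personalizes it, and one
# head per self-report target implements multitask learning.
import torch
import torch.nn as nn

class MacroMicroMTL(nn.Module):
    def __init__(self, feat_dim, n_users, emb_dim=64, user_dim=16, n_tasks=2):
        super().__init__()
        # Macro: shared across all participants.
        self.macro_encoder = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )
        # Micro: one learned personalization vector per participant.
        self.micro_embedding = nn.Embedding(n_users, user_dim)
        # One regression head per mental-health target (multitask learning).
        self.task_heads = nn.ModuleList(
            [nn.Linear(emb_dim + user_dim, 1) for _ in range(n_tasks)]
        )

    def forward(self, features, user_ids):
        macro = self.macro_encoder(features)        # global emotion embedding
        micro = self.micro_embedding(user_ids)      # individual personalization embedding
        joint = torch.cat([macro, micro], dim=-1)
        return [head(joint) for head in self.task_heads]

# Example: 32 samples of 300-dim fused multimodal features from 10 participants.
model = MacroMicroMTL(feat_dim=300, n_users=10)
predictions = model(torch.randn(32, 300), torch.randint(0, 10, (32,)))
```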

Multidisciplinary Design–Based Multimodal Virtual Reality Simulation in Nursing Education: Mixed Methods Study

According to user evaluations, multimodal interaction with voice communication and implementation of direct actions in the environment were viewed as positive and provided a moderate level of presence. However, communication accuracy and technical features related to virtual object manipulation required improvement. To the best of our knowledge, this is the first study to combine voice interactions with direct hand manipulations in practical nursing training.

Ji-Young Yeo, Hyeongil Nam, Jong-Il Park, Soo-Yeon Han

JMIR Med Educ 2024;10:e53106

Multimodal ChatGPT-4V for Electrocardiogram Interpretation: Promise and Limitations

Accuracy of the multimodal ChatGPT-4V model in answering multiple-choice questions related to electrocardiogram (ECG) interpretation. The number of correct responses among 3 attempts for each question is shown from left to right. The accuracy rates with at least 1, 2, and 3 correct responses are annotated on the right, from bottom to top. Different shapes represent different question types. We evaluated ChatGPT-4V responses against the official reference answers as the standard for reliability.

Lingxuan Zhu, Weiming Mou, Keren Wu, Yancheng Lai, Anqi Lin, Tao Yang, Jian Zhang, Peng Luo

J Med Internet Res 2024;26:e54607
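
The evaluation described in the entry above reports accuracy as the share of ECG questions answered correctly in at least 1, 2, or 3 of the 3 attempts. A minimal sketch of that metric follows, with hypothetical counts rather than the study's data:

```python
# Minimal sketch of the "at least k of 3 attempts correct" accuracy metric
# described above; the counts are hypothetical, not the study's results.
def accuracy_at_least(correct_counts, k):
    """Fraction of questions answered correctly in at least k of the 3 attempts."""
    return sum(c >= k for c in correct_counts) / len(correct_counts)

# correct_counts[i] = number of correct responses (0-3) for question i.
correct_counts = [3, 2, 0, 1, 3, 2]
for k in (1, 2, 3):
    print(f"Accuracy with at least {k} correct: {accuracy_at_least(correct_counts, k):.2f}")
```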

Efficacy of a WeChat-Based Multimodal Digital Transformation Management Model in New-Onset Mild to Moderate Hypertension: Randomized Clinical Trial

However, studies that comprehensively evaluate personalized hypertension management (which focuses on individual differences, a customized approach, and the incorporation of multimodal data, digital transformation, and multimodal intervention) using rigorous assessments of its effectiveness are scarce.

Yijun Wang, Fuding Guo, Jun Wang, Zeyan Li, Wuping Tan, Mengjie Xie, Xiaomeng Yang, Shoupeng Duan, Lingpeng Song, Siyi Cheng, Zhihao Liu, Hengyang Liu, Jiaming Qiao, Yueyi Wang, Liping Zhou, Xiaoya Zhou, Hong Jiang, Lilei Yu

J Med Internet Res 2023;25:e52464

The Validation of Automated Social Skills Training in Members of the General Population Over 4 Weeks: Comparative Study

We predicted the ratings from user behavioral indicators using multimodal features (Praat [24], OpenFace [25], and OpenPose [26]) and BERT [22] similarity scores between the utterances spoken by the conversational agent and the users. A random forest predicted the scores from these features, and 2 psychiatrists rated the ground-truth values.

Hiroki Tanaka, Takeshi Saga, Kota Iwauchi, Masato Honda, Tsubasa Morimoto, Yasuhiro Matsuda, Mitsuhiro Uratani, Kosuke Okazaki, Satoshi Nakamura

JMIR Form Res 2023;7:e44857
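
The prediction pipeline described in the entry above can be sketched as a random forest regressing psychiatrist ratings onto concatenated multimodal features plus a BERT similarity score. The feature sizes and data below are hypothetical, not the authors' pipeline:

```python
# Minimal sketch (assumed feature layout, not the authors' pipeline): acoustic
# (Praat), facial (OpenFace), and pose (OpenPose) features are concatenated
# with a BERT similarity score, and a random forest regresses the
# psychiatrist-rated scores onto them.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 40                                   # hypothetical number of sessions
acoustic = rng.normal(size=(n, 20))      # e.g., Praat prosodic features
face = rng.normal(size=(n, 30))          # e.g., OpenFace action units
pose = rng.normal(size=(n, 10))          # e.g., OpenPose keypoint statistics
bert_sim = rng.uniform(size=(n, 1))      # agent-user utterance similarity
X = np.hstack([acoustic, face, pose, bert_sim])
y = rng.uniform(1, 5, size=n)            # psychiatrist ratings (ground truth)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print("Cross-validated R^2:", cross_val_score(model, X, y, cv=5).mean())
```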

Impacts of Digital Care Programs for Musculoskeletal Conditions on Depression and Work Productivity: Longitudinal Cohort Study

Previously, we reported a multimodal digital care program (DCP) that integrates physical therapy exercise-based management with a psychoeducational component, including CBT, that aims to encourage patients to develop self-management skills and strategies for their pain. This DCP has been validated in different MSK conditions in chronic [37], acute [38,39], and postsurgical contexts [40-43].

Fabíola Costa, Dora Janela, Maria Molinos, Robert Moulder, Vírgilio Bento, Jorge Lains, Justin Scheer, Vijay Yanamadala, Steven Cohen, Fernando Dias Correia

J Med Internet Res 2022;24(7):e38942

Multimodal Assessment of Schizophrenia and Depression Utilizing Video, Acoustic, Locomotor, Electroencephalographic, and Heart Rate Technology: Protocol for an Observational Study

We present a research protocol to assess the efficacy of a multimodal sensor system combining video, audio, actigraphy, noninvasive EEG, and heart rate monitoring to assess differences among individuals with schizophrenia, individuals with unipolar major depressive disorder, and controls (patients with no history of mental illness in the preceding year).

Robert O Cotes, Mina Boazak, Emily Griner, Zifan Jiang, Bona Kim, Whitney Bremer, Salman Seyedi, Ali Bahrami Rad, Gari D Clifford

JMIR Res Protoc 2022;11(7):e36417
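
The protocol in the entry above fuses several sensing modalities to compare the three groups. A minimal, assumed sketch (not the protocol's analysis plan) of how concatenated per-modality features could feed a simple group classifier:

```python
# Minimal sketch (assumed data layout, not the protocol's analysis plan):
# per-session features from each sensing modality are concatenated and a
# simple classifier separates the three groups.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 60                                   # hypothetical number of sessions
modality_dims = {"video": 25, "audio": 20, "actigraphy": 5, "eeg": 40, "heart_rate": 4}
X = np.hstack([rng.normal(size=(n, d)) for d in modality_dims.values()])
y = rng.integers(0, 3, size=n)           # 0 = schizophrenia, 1 = depression, 2 = control

clf = LogisticRegression(max_iter=1000)
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```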