Enhancing Healthcare Interactions and Documentation with Amazon Bedrock and Amazon Transcribe through the Live Meeting Assistant

Currently, healthcare professionals dedicate nearly 49% of their working hours to documenting clinical visits, significantly affecting both productivity and patient care. In fact, for every eight hours of patient appointments, physicians spend over five hours on electronic health records (EHRs). This has led many healthcare providers to seek out conversational intelligence solutions, where doctor-patient conversations are automatically transcribed during consultations and transformed into clinical documentation through artificial intelligence (AI) technology—streamlining their workflow.

The Live Meeting Assistant (LMA) for healthcare uses generative AI and Amazon Transcribe to offer real-time support and automated clinical note creation during virtual patient interactions. Originally conceived for real-time transcription and assistance during online meetings, as highlighted in the launch blog post, the LMA for healthcare extends this functionality to generate clinical notes automatically during virtual consultations between doctors and patients. The solution captures audio and metadata directly from your browser-based meeting application (currently compatible with platforms like Zoom and Chime, with more on the way), as well as from other audio input sources. It then converts spoken words into text using Amazon Transcribe and employs foundation models (FMs) from Amazon Bedrock to create customized clinical notes in real time.
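To make the transcription step concrete, here is a minimal sketch (not the LMA's actual code) of real-time speech-to-text using the amazon-transcribe streaming SDK for Python. The region, audio file name, sample rate, and chunk size are illustrative assumptions; in the LMA itself, audio comes from the browser or other live sources rather than from a file.

```python
# Minimal sketch: stream PCM audio to Amazon Transcribe and print finalized
# transcript segments. File name, region, and chunk size are assumptions.
import asyncio

from amazon_transcribe.client import TranscribeStreamingClient
from amazon_transcribe.handlers import TranscriptResultStreamHandler
from amazon_transcribe.model import TranscriptEvent

AUDIO_FILE = "consultation.pcm"   # hypothetical 16 kHz, 16-bit, mono PCM recording
CHUNK_BYTES = 1024 * 8            # illustrative chunk size


class PrintFinalSegments(TranscriptResultStreamHandler):
    """Print each finalized transcript segment as it arrives."""

    async def handle_transcript_event(self, transcript_event: TranscriptEvent):
        for result in transcript_event.transcript.results:
            if not result.is_partial:
                for alternative in result.alternatives:
                    print(alternative.transcript)


async def stream_audio(stream):
    """Feed raw PCM audio to Transcribe in small chunks, then close the stream."""
    with open(AUDIO_FILE, "rb") as audio:
        while chunk := audio.read(CHUNK_BYTES):
            await stream.input_stream.send_audio_event(audio_chunk=chunk)
            await asyncio.sleep(0.1)  # pace the upload roughly like live audio
    await stream.input_stream.end_stream()


async def main():
    client = TranscribeStreamingClient(region="us-east-1")
    stream = await client.start_stream_transcription(
        language_code="en-US",
        media_sample_rate_hz=16000,
        media_encoding="pcm",
    )
    await asyncio.gather(
        stream_audio(stream),
        PrintFinalSegments(stream.output_stream).handle_events(),
    )


if __name__ == "__main__":
    asyncio.run(main())
```

In the deployed solution, finalized segments populate the live transcript view and serve as context for the note-generation and meeting-assist features described below.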

With LMA for healthcare, clinicians can offer personalized recommendations, enhancing the overall quality of care. The solution spares healthcare workers from spending extra hours on documentation: automated transcription of conversations, combined with advanced large language models (LLMs), generates initial drafts of clinical notes for EHRs or other downstream systems. This alleviates the documentation burden on providers, enabling them to start from a preliminary version and make necessary edits instead of writing notes from scratch. Consequently, healthcare professionals can devote more time to patient care while reducing the risk of burnout.

We encourage you to check out this demo, which showcases the LMA for healthcare in action through a simulated patient interaction.

Differences Between AWS HealthScribe and the LMA for Healthcare

AWS HealthScribe is a fully managed API-based service that generates initial clinical notes offline after patient visits, aimed at application developers. It has been rigorously tested against datasets to minimize inaccuracies and ensure that every sentence in the summaries corresponds with the original transcript through evidence mapping, which is essential for effective review and accuracy validation.

In contrast, LMA for healthcare is an open-source end-to-end application that serves as a virtual assistant for clinicians, enhancing productivity and alleviating administrative tasks, including clinical documentation. It utilizes various AWS services to provide a real-time transcription and generative AI experience out of the box, and can be used as is, customized as needed, and adapted to create tailored features and integrations. Although LMA offers flexibility through AWS services like Amazon Bedrock, achieving accuracy, minimizing inaccuracies, and providing evidence mapping requires additional effort compared to the pre-built robustness of AWS HealthScribe. In the future, we anticipate that LMA for healthcare will incorporate the AWS HealthScribe API along with other AWS services.

Solution Overview

Everything necessary is available as open source in our GitHub repository and is easy to deploy in your AWS account. To utilize this sample application, you’ll need an AWS account and an AWS Identity and Access Management (IAM) role with resource management permissions. If you still need an AWS account, you can follow the instructions detailed in How to create and activate a new AWS account.

Follow the guidelines in the “Deploy the solution using AWS CloudFormation” section of this LMA blog post to initiate deployment. When setting up LMA for healthcare, select Healthcare from the dropdown menu as your domain.
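If you prefer a scripted deployment, the stack can also be created programmatically. The sketch below uses boto3; the template URL and the domain parameter key are placeholders, so take the actual values from the LMA repository and the launch blog post, and note that the console-based Launch Stack flow is the documented path.

```python
# Hedged sketch of a scripted CloudFormation deployment. The template URL and
# the "Domain" parameter key are placeholders, not the template's real values.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

cfn.create_stack(
    StackName="LMA-healthcare",
    TemplateURL="https://<bucket>.s3.amazonaws.com/<path>/lma-main.yaml",  # placeholder
    Parameters=[
        # Assumed parameter key for the domain selection described above.
        {"ParameterKey": "Domain", "ParameterValue": "Healthcare"},
    ],
    Capabilities=["CAPABILITY_NAMED_IAM", "CAPABILITY_AUTO_EXPAND"],
)
```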

This blog post focuses on the Amazon Transcribe-powered LMA solution tailored for the healthcare sector. The LMA for healthcare streamlines documentation after patient consultations: it automatically generates detailed post-call summaries, highlights key topics discussed between the clinician and the patient, and formats clinical notes in structured layouts such as SOAP (Subjective, Objective, Assessment, Plan) and BIRP (Behavior, Intervention, Response, Plan). During the conversation itself, the meeting assist bot can summarize the discussion so far, pinpoint significant topics, and list patient symptoms as they are mentioned.
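To illustrate the note-generation step, the following is a minimal sketch of asking an Amazon Bedrock model to turn a visit transcript into a SOAP note using the Converse API in boto3. The model ID, prompt wording, and inference settings are illustrative assumptions; the LMA's actual prompts are configurable and differ from this.

```python
# Minimal sketch: draft a SOAP note from a transcript with Amazon Bedrock.
# Model ID, prompt, and inference settings are illustrative assumptions.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

transcript = (
    "Clinician: What brings you in today?\n"
    "Patient: I've been feeling anxious in social situations for a few months..."
)  # in practice, the transcript produced by Amazon Transcribe

prompt = (
    "You are assisting a clinician. Draft a SOAP note (Subjective, Objective, "
    "Assessment, Plan) from the visit transcript below. Use only information "
    "stated in the transcript and flag anything that is unclear.\n\n" + transcript
)

response = bedrock.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model choice
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 1024, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```

A BIRP note or post-call summary follows the same pattern with a different instruction, and the generated draft is meant as a starting point for the clinician to review and edit.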

By selecting ASK ASSISTANT, the healthcare provider can prompt the meeting assistant, which can draw from an Amazon Bedrock knowledge base (if activated), to suggest appropriate responses based on the recent meeting interactions captured in the transcript. Prompting is a technique used in natural language processing (NLP) and language models to provide context to the model, enabling it to generate relevant and coherent output.

The Amazon Bedrock knowledge base allows you to consolidate various data sources into a centralized repository. This capability enables the creation of applications that utilize Retrieval-Augmented Generation (RAG), where information retrieval from data sources enhances the model’s response generation. With LMA, you can integrate an Amazon Bedrock knowledge base and provide your organization’s data. Furthermore, the Bedrock knowledge base can even crawl external websites, allowing it to search for relevant information during patient interactions, such as data from the CDC website.

For instance, if research documents on social anxiety are added to the Amazon Bedrock knowledge base, you can reference this information during live patient interactions. To activate the assistant, simply say “Okay, Assistant,” click the ASK ASSISTANT! button, or type your question in the user interface. In the following example, we prompted the assistant to share research papers on social anxiety from the set of documents previously uploaded to the knowledge base.

As demonstrated, the meeting assist bot successfully answered the question posed during the live call: “Okay, Assistant, is there any case study reference on social anxiety?” The bot provided a relevant response by citing a source from the Amazon Simple Storage Service (Amazon S3) bucket where the reference documents are stored.
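For readers who want to see how such a knowledge base lookup could work programmatically, here is a minimal sketch using the Amazon Bedrock RetrieveAndGenerate API in boto3, which returns a generated answer along with citations to the source documents in Amazon S3. The knowledge base ID, model ARN, and region are placeholders, and this is not the LMA's exact implementation.

```python
# Minimal sketch: query an Amazon Bedrock knowledge base (RAG) and list the
# S3 sources cited in the answer. The ID, ARN, and region are placeholders.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.retrieve_and_generate(
    input={"text": "Is there any case study reference on social anxiety?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KBEXAMPLE123",  # placeholder knowledge base ID
            "modelArn": (
                "arn:aws:bedrock:us-east-1::foundation-model/"
                "anthropic.claude-3-haiku-20240307-v1:0"  # assumed model
            ),
        },
    },
)

print(response["output"]["text"])
for citation in response.get("citations", []):
    for ref in citation.get("retrievedReferences", []):
        uri = ref.get("location", {}).get("s3Location", {}).get("uri")
        if uri:
            print("Source:", uri)
```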

Note: It is recommended to use an Amazon Bedrock knowledge base strictly for information retrieval and search purposes, not for generating direct recommendations regarding patient care.
