Enhance Healthcare Interactions and Documentation with Amazon Bedrock and Amazon Transcribe through the Live Meeting Assistant

In today’s healthcare landscape, physicians devote nearly 49% of their working hours to documenting clinical visits, which significantly hampers both productivity and the quality of patient care. For every eight hours that office-based physicians schedule for patient interactions, they spend more than five hours working in the electronic health record (EHR). As a result, many healthcare professionals are turning to conversational intelligence solutions that automatically transcribe doctor-patient conversations and use advanced artificial intelligence (AI) to transform them into clinical documentation.

The Live Meeting Assistant (LMA) for healthcare, powered by generative AI and Amazon Transcribe, provides real-time support and automated clinical note generation during virtual patient consultations. Originally built for real-time transcription during virtual meetings, as described in the launch blog post, LMA has been customized for healthcare to automatically generate clinical notes from doctor-patient conversations. It captures speaker audio and metadata directly from browser-based meeting applications such as Zoom and Chime, and it can also process audio from softphones or other audio sources. The solution uses Amazon Transcribe to accurately convert speech to text and foundation models (FMs) from Amazon Bedrock to create tailored clinical notes in real time.
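To give a sense of the transcription mechanics, the following minimal sketch streams an audio file to Amazon Transcribe with the Python streaming SDK and prints each finalized segment. It is illustrative only: the file name, chunk size, and Region are assumptions, and LMA itself streams audio captured from the browser tab and softphone rather than from a file.

```python
# Minimal sketch: real-time speech-to-text with the Amazon Transcribe
# streaming SDK (pip install amazon-transcribe).
import asyncio

from amazon_transcribe.client import TranscribeStreamingClient
from amazon_transcribe.handlers import TranscriptResultStreamHandler
from amazon_transcribe.model import TranscriptEvent


class PrintFinalSegments(TranscriptResultStreamHandler):
    """Print each finalized transcript segment as it arrives."""

    async def handle_transcript_event(self, transcript_event: TranscriptEvent):
        for result in transcript_event.transcript.results:
            if not result.is_partial:  # skip interim (partial) hypotheses
                for alt in result.alternatives:
                    print(alt.transcript)


async def transcribe_file(path: str = "visit-audio.pcm"):  # placeholder file name
    client = TranscribeStreamingClient(region="us-east-1")
    stream = await client.start_stream_transcription(
        language_code="en-US",
        media_sample_rate_hz=16_000,
        media_encoding="pcm",
    )

    async def send_audio():
        # Feed raw 16 kHz PCM audio to the service in small chunks.
        with open(path, "rb") as f:
            while chunk := f.read(1024 * 16):
                await stream.input_stream.send_audio_event(audio_chunk=chunk)
        await stream.input_stream.end_stream()

    handler = PrintFinalSegments(stream.output_stream)
    await asyncio.gather(send_audio(), handler.handle_events())


if __name__ == "__main__":
    asyncio.run(transcribe_file())
```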

By using the LMA for healthcare, practitioners can provide personalized recommendations and improve the quality of care. The solution alleviates the documentation burden by generating draft clinical notes for EHRs or other downstream applications, so clinicians can start from a preliminary draft and simply review and adjust it rather than writing notes from scratch. This shift not only gives healthcare professionals more time to devote to patient care but also reduces the risk of clinician burnout.

Explore the following demo, which illustrates the LMA for healthcare in action during a simulated patient interaction.

What Distinguishes AWS HealthScribe from the LMA for Healthcare?

AWS HealthScribe is a fully managed, API-based service that generates preliminary clinical notes offline after patient visits, aimed at application developers. It has undergone extensive testing against diverse datasets to minimize inaccuracies and ensure that each summary sentence is linked to the original transcript through evidence mapping, which is essential for effective review and accuracy validation.
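For orientation, here is a hedged sketch of how an application developer might submit an offline AWS HealthScribe job through the Amazon Transcribe API with boto3. The job name, bucket names, and IAM role ARN are placeholders, and this call is separate from the LMA deployment.

```python
# Hedged sketch: submit an offline AWS HealthScribe job via the Amazon
# Transcribe API and check its status.
import boto3

transcribe = boto3.client("transcribe", region_name="us-east-1")

transcribe.start_medical_scribe_job(
    MedicalScribeJobName="patient-visit-001",                 # hypothetical job name
    Media={"MediaFileUri": "s3://my-visit-audio/visit.wav"},  # placeholder bucket
    OutputBucketName="my-healthscribe-output",                # placeholder bucket
    DataAccessRoleArn="arn:aws:iam::123456789012:role/HealthScribeDataAccess",
    Settings={
        "ShowSpeakerLabels": True,   # separate clinician and patient speech
        "MaxSpeakerLabels": 2,
    },
)

# Poll for completion; the transcript and preliminary clinical note, with
# evidence links back to the transcript, land in the output bucket as JSON.
status = transcribe.get_medical_scribe_job(MedicalScribeJobName="patient-visit-001")
print(status["MedicalScribeJob"]["MedicalScribeJobStatus"])
```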

Conversely, the LMA for healthcare is an open-source end-to-end application that serves as a virtual assistant for clinicians, enhancing productivity and relieving administrative burdens, including clinical documentation. Utilizing various AWS services, it provides a real-time transcription and generative AI experience out of the box and can be customized to meet specific needs and create tailored integrations. While LMA’s flexibility is an asset, achieving accuracy, reducing hallucinations, and ensuring evidence mapping requires more effort compared to the robust framework offered by AWS HealthScribe. In the future, we anticipate that LMA for healthcare will integrate with the AWS HealthScribe API alongside other AWS services.

Solution Overview

All components you need are available as open source in our GitHub repository and are easy to deploy in your AWS account. To make use of this sample application, you’ll need an AWS account and an AWS Identity and Access Management (IAM) role with appropriate resource management permissions. If you haven’t set up an AWS account yet, you can do so by following the instructions in How do I create and activate a new AWS account?

To begin deployment, follow the guidelines in Deploy the solution using AWS CloudFormation in the LMA launch blog post. When deploying the LMA for healthcare, choose Healthcare from the domain dropdown menu.

This blog post details deployment steps, including downloading and installing the Chrome browser extension, initiating LMA usage, process flows, monitoring and troubleshooting procedures, cost evaluation, and customization options for your deployment.

The LMA for healthcare helps clinicians document patient visits efficiently. It automatically generates a thorough post-call summary highlighting the key points discussed between the clinician and the patient, and it presents clinical notes in structured formats such as SOAP (Subjective, Objective, Assessment, Plan) and BIRP (Behavior, Intervention, Response, Plan). Using the meeting assist bot, it can also summarize the ongoing discussion, identify key topics as they are mentioned, and list patient symptoms as they come up during the conversation.
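As a rough illustration of the note-generation step, the following sketch asks an Amazon Bedrock foundation model to draft a SOAP note from a finished transcript using the Converse API. The model ID, transcript snippet, and prompt wording are assumptions for illustration; the deployed LMA uses its own prompts, which you can customize as described in the launch blog post.

```python
# Minimal sketch: draft a SOAP note from a visit transcript with an
# Amazon Bedrock foundation model via the Converse API.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

transcript = (
    "CLINICIAN: What brings you in today? "
    "PATIENT: I've had a persistent cough for about two weeks..."
)

prompt = (
    "You are assisting a clinician with documentation. From the transcript "
    "below, draft a clinical note in SOAP format (Subjective, Objective, "
    "Assessment, Plan). Include only facts stated in the transcript.\n\n"
    f"<transcript>\n{transcript}\n</transcript>"
)

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # any Bedrock text model you have access to
    messages=[{"role": "user", "content": [{"text": prompt}]}],
    inferenceConfig={"maxTokens": 1024, "temperature": 0.2},
)

print(response["output"]["message"]["content"][0]["text"])
```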

By selecting ASK ASSISTANT, healthcare professionals can prompt the meeting assistant to access an Amazon Bedrock knowledge base (if enabled) and propose appropriate responses based on recent meeting interactions captured in the transcript. This prompting technique is pivotal in natural language processing (NLP) as it guides the model to generate relevant and coherent outputs.

The Amazon Bedrock knowledge base lets you consolidate data from various sources into a centralized repository, so you can build applications that use Retrieval Augmented Generation (RAG), in which information retrieval grounds the model’s response generation. With the LMA, you can ingest your organization’s data into an Amazon Bedrock knowledge base, and the knowledge base can even crawl external websites (for example, the CDC website), helping it retrieve information relevant to the conversation during patient visits.
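To illustrate what such a lookup can look like, the following hedged sketch queries a Bedrock knowledge base with the RetrieveAndGenerate API and prints the answer along with the S3 locations of the cited source documents. The knowledge base ID and model ARN are placeholders; when the knowledge base option is enabled, the LMA deployment wires this integration up for you.

```python
# Hedged sketch: query an Amazon Bedrock knowledge base (RAG) and list the
# S3 documents cited in the answer.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = agent_runtime.retrieve_and_generate(
    input={"text": "Is there any case study reference on social anxiety?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB1234567890",  # placeholder knowledge base ID
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/"
                        "anthropic.claude-3-sonnet-20240229-v1:0",
        },
    },
)

print(response["output"]["text"])

# Each citation points back to the S3 document the answer was drawn from.
for citation in response.get("citations", []):
    for ref in citation.get("retrievedReferences", []):
        print(ref["location"]["s3Location"]["uri"])
```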

In the following scenario, research documents related to social anxiety were added to the Amazon Bedrock knowledge base, allowing reference to this information during live patient interactions. To activate the assistant, say “Okay, Assistant,” choose the ASK ASSISTANT! button, or type your question in the UI. In the following figure, we inquired about available research papers on social anxiety from the documents we provided during setup.

As seen in the figure, the meeting assist bot successfully answered the query posed during the live call: “Okay, Assistant, is there any case study reference on social anxiety?” The bot provided a relevant response by citing a source from the Amazon Simple Storage Service (Amazon S3) where the reference documents are stored.

Note: We recommend using the Amazon Bedrock knowledge base solely for information retrieval and search, not for generating direct recommendations regarding patient care.
