Monday, 1 September 2025

Building a Voice-Based Health Education Assistant with React, AWS, and ChatGPT

Imagine a patient speaking into a mobile app and instantly receiving a clear, natural-sounding explanation of their health concern. With advances in speech recognition, natural language processing, and text-to-speech, this is no longer futuristic, and we'll see a demo of the implementation below.

In this post, we’ll explore how to architect and implement a voice AI chatbot for health education that:

  1. Listens to a user’s health question.

  2. Transcribes it into text.

  3. Queries an AI model (ChatGPT) for an answer.

  4. Converts the response back into natural speech.

  5. Plays it back to the patient.
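On the client side, the browser's MediaRecorder API is a straightforward way to capture microphone audio and stream it to the backend over Socket.IO. Below is a minimal sketch of that capture-and-playback loop; the event names (audio-chunk, speech) and the server URL are illustrative assumptions, not the exact identifiers used in the demo.

```ts
import { io } from "socket.io-client";

const socket = io("http://localhost:3000"); // assumed backend URL

// Capture microphone audio and stream it to the backend in small chunks.
async function startRecording(): Promise<MediaRecorder> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);

  recorder.ondataavailable = (e) => {
    if (e.data.size > 0) {
      socket.emit("audio-chunk", e.data); // hypothetical event name
    }
  };

  recorder.start(250); // emit a chunk every 250 ms
  return recorder;
}

// Play back the synthesized answer returned by the server.
socket.on("speech", (audio: ArrayBuffer) => {
  const url = URL.createObjectURL(new Blob([audio], { type: "audio/mpeg" }));
  new Audio(url).play();
});
```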


Why Voice for Health Education?

  • Accessibility: Voice removes literacy barriers and makes health information more inclusive.

  • Convenience: Patients can interact hands-free.

  • Engagement: Natural conversations feel more intuitive than reading long articles.


High-Level Architecture

Here’s the flow of the system (a minimal server-side sketch follows the list):

  1. User speaks into a React app.

  2. The app streams the audio via Socket.IO to a backend server, which relays it to AWS Transcribe.

  3. AWS Transcribe converts speech to text and returns it.

  4. The transcription is sent to the ChatGPT API.

  5. ChatGPT generates a patient-friendly answer.

  6. The answer text is sent to AWS Polly.

  7. Polly generates natural voice audio.

  8. The React app plays the audio back to the user.
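To make steps 4–8 concrete, here is a minimal server-side sketch that picks up once AWS Transcribe has produced a transcript (the Transcribe streaming leg and error handling are omitted for brevity). The model name, Polly voice, and Socket.IO event names are illustrative assumptions rather than the demo's exact values.

```ts
import { Server } from "socket.io";
import OpenAI from "openai";
import { PollyClient, SynthesizeSpeechCommand } from "@aws-sdk/client-polly";

const io = new Server(3000, { cors: { origin: "*" } });
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const polly = new PollyClient({ region: "us-east-1" });

io.on("connection", (socket) => {
  // Fired once AWS Transcribe has returned the final transcript (not shown).
  socket.on("transcript", async (question: string) => {
    // Steps 4-5: ask ChatGPT for a patient-friendly answer.
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // assumed model
      messages: [
        {
          role: "system",
          content:
            "You are a health educator. Answer in plain, patient-friendly language and do not give medical advice.",
        },
        { role: "user", content: question },
      ],
    });
    const answer = completion.choices[0].message.content ?? "";

    // Steps 6-7: convert the answer to natural speech with AWS Polly.
    const speech = await polly.send(
      new SynthesizeSpeechCommand({
        Text: answer,
        OutputFormat: "mp3",
        VoiceId: "Joanna", // assumed voice
        Engine: "neural",
      })
    );
    const audio = await speech.AudioStream?.transformToByteArray();

    // Step 8: send the audio back for playback in the React app.
    socket.emit("speech", audio);
  });
});
```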

Demo Video

Note: This architecture is for demo purposes only and is not intended for production use.


Monday, 25 August 2025

Enhancing Healthcare Interoperability with Robotic Process Automation (RPA) Using Web-Based EHRs

In the realm of healthcare, seamless data exchange is critical, not just for compliance with regulations but also for improving patient care. While HL7 FHIR APIs are becoming the gold standard for interoperability, many legacy and even modern web-based EHR systems still lack comprehensive APIs. This is where Robotic Process Automation (RPA) can bridge the gap.

Healthcare interoperability developers should stay aware of current advancements in the interoperability space, which is what prompted me to write this post.

In this post, we’ll explore how RPA can be used in healthcare interoperability, particularly for automating interactions with web-based systems that don’t expose APIs. We’ll demonstrate a proof of concept, using TypeScript and Puppeteer, that calculates a cancer risk score for a patient via a web form and extracts the result automatically. We will use https://ccrisktool.cancer.gov/calculator.html as a demo site: it presents a form that, once filled in, calculates your cancer risk score based on the inputs. Its form structure is complex enough to demonstrate different kinds of fields for RPA, yet simple enough for the purposes of this demo.


Why Use RPA in Healthcare Interoperability?

RPA involves software robots that mimic human interactions with a user interface. In healthcare, RPA can:

  • Automate data entry from FHIR servers into web-based EHRs.

  • Extract lab results, risk scores, or notes from legacy systems.

  • Bridge non-API workflows between systems.

  • Reduce clerical burden on clinical staff.

This is particularly useful when dealing with external systems that offer valuable tools but no APIs — like online risk calculators, insurance portals, or registry forms.


Use Case: Automating a Cancer Risk Assessment Tool

Imagine you want to integrate an online cancer risk calculator into your interoperable workflow. Instead of waiting for an official API, we can use Puppeteer to build a backend automation that exposes this workflow as a RESTful API (a code sketch follows the steps below):

  1. POST a RESTful API request to our server.

  2. The server launches the risk calculator webpage https://ccrisktool.cancer.gov/calculator.html in a headless browser.

  3. It fills out the form using the data received in the API request.

  4. It submits the form.

  5. It scrapes and stores the result (e.g., a 5-year risk score).

  6. It returns the result as JSON in the API response.
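A minimal sketch of such an endpoint is shown below. The request fields and CSS selectors (#age, #submit-button, #risk-result) are hypothetical placeholders, since the real form's selectors have to be inspected in the browser; the GitHub repo linked below contains the actual implementation.

```ts
import express from "express";
import puppeteer from "puppeteer";

const app = express();
app.use(express.json());

app.post("/risk-score", async (req, res) => {
  const browser = await puppeteer.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto("https://ccrisktool.cancer.gov/calculator.html", {
      waitUntil: "networkidle2",
    });

    // Fill the form from the request body. Selectors are hypothetical;
    // inspect the real page to find the actual element IDs.
    await page.select("#age", String(req.body.age));
    await page.click("#submit-button");

    // Wait for the result element to render, then scrape the score.
    await page.waitForSelector("#risk-result");
    const score = await page.$eval("#risk-result", (el) =>
      el.textContent?.trim()
    );

    res.json({ fiveYearRisk: score });
  } catch (err) {
    res.status(500).json({ error: String(err) });
  } finally {
    await browser.close();
  }
});

app.listen(3000);
```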


Node.js source code for this demo implementation is available on my GitHub: https://github.com/j4jayant/rpa-puppeteer-demo


Sample Postman Call screenshot:

Sample Website Output Page:


Disclaimer:
The proof of concept shown here is for educational purposes only. Automating interactions with third-party websites using RPA should never be done without the explicit approval of the site owner or developer. Unauthorized automation may violate the site's terms of service, privacy obligations, or applicable laws. Always seek proper permissions, ensure compliance with healthcare data protection regulations (e.g., HIPAA), and use such approaches responsibly in production environments. I used this website purely for demonstration purposes and don't encourage using it unnecessarily just to play around.






Monday, 2 June 2025

MCP + FHIR - Example MCP Server integrated with FHIR

In the ever-evolving landscape of healthcare data, interoperability is paramount. FHIR has emerged as a leading standard for exchanging healthcare information, but raw FHIR interactions can sometimes feel a bit… bare, especially when working with AI apps that use LLMs extensively. What if we could add a layer of context, logic, and user-friendly abstractions on top of FHIR? This is precisely where an MCP (Model Context Protocol) server comes into play.

In this post, we'll explore the implementation of an MCP server, built with TypeScript, that seamlessly integrates with a FHIR server (specifically tested with Aidbox). This server will empower applications to not only create FHIR resources with contextual understanding but also to read and interpret specific FHIR resources with ease.

The source code for this quick-and-dirty implementation of MCP + FHIR integration is available on my GitHub: https://github.com/j4jayant/mcp-fhir-server

This implementation is inspired by the Aidbox article https://www.health-samurai.io/articles/mcp-fhir-server and the related articles of Dr. Pawan Jindal on LinkedIn.

I implemented the same approach and was able to achieve similar results.

This MCP server has a couple of tools:

1. create-fhir-resource

    This tool takes two parameters:

  • ResourceType (string like Patient, Appointment etc.)
  • ResourceBody (string with raw JSON of the resource)

    Sample Request and Output in Claude Desktop

2. read-fhir-resource

    This tool takes two parameters:

  • ResourceType (string like Patient, Appointment etc.)
  • ResourceID (string with ID of the resource)

    Sample Request and Output in Claude Desktop

The MCP server receives requests from the LLM and forwards them to the FHIR server.
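To give a feel for the shape of the code, here is a minimal sketch of how the two tools might be registered with the MCP TypeScript SDK. The FHIR base URL and the absence of authentication are simplifying assumptions; the GitHub repo above has the actual implementation.

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Assumed FHIR endpoint (e.g., a local Aidbox instance); auth omitted for brevity.
const FHIR_BASE = process.env.FHIR_BASE_URL ?? "http://localhost:8080/fhir";

const server = new McpServer({ name: "mcp-fhir-server", version: "0.1.0" });

// Tool 1: create a FHIR resource from a raw JSON body.
server.tool(
  "create-fhir-resource",
  { resourceType: z.string(), resourceBody: z.string() },
  async ({ resourceType, resourceBody }) => {
    const resp = await fetch(`${FHIR_BASE}/${resourceType}`, {
      method: "POST",
      headers: { "Content-Type": "application/fhir+json" },
      body: resourceBody,
    });
    return { content: [{ type: "text" as const, text: await resp.text() }] };
  }
);

// Tool 2: read a FHIR resource by type and ID.
server.tool(
  "read-fhir-resource",
  { resourceType: z.string(), resourceId: z.string() },
  async ({ resourceType, resourceId }) => {
    const resp = await fetch(`${FHIR_BASE}/${resourceType}/${resourceId}`);
    return { content: [{ type: "text" as const, text: await resp.text() }] };
  }
);

// Claude Desktop talks to the server over stdio.
await server.connect(new StdioServerTransport());
```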

This implementation is tested with Claude Desktop and the Aidbox FHIR server.

The Claude configurations to run this server locally can be copied from the Aidbox article.
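For reference, the entry in claude_desktop_config.json generally looks like the snippet below; the server name and the path to the built entry point are assumptions that depend on your local setup.

```json
{
  "mcpServers": {
    "fhir": {
      "command": "node",
      "args": ["/path/to/mcp-fhir-server/build/index.js"]
    }
  }
}
```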


Thursday, 17 April 2025

Simple Healthcare Workflow Automation: Delivering Personalized Education Post Doctor Consultation with n8n

In today's healthcare landscape, empowering patients with relevant information is crucial for better health outcomes and engagement. What if we could seamlessly deliver diagnosis-specific educational materials to patients immediately after their doctor's appointment? 

This blog post explores how to implement an AI agent using the powerful automation platform n8n.io to achieve just that. By leveraging HL7 messaging, OpenAI's GPT-4o mini model, and SendGrid, we can create a truly personalized and efficient patient education workflow.

For this demo, I have used the following tools:

  • n8n (to create an AI agent that automates the workflow)
  • OpenAI's GPT-4o mini model (to process the prompt and generate relevant educational content)
  • Mirth Connect (to simulate receiving HL7 SIU messages from the EHR with a checked-out status; to keep it simple, we assume the SIU feed includes a DG1 segment with the encounter diagnosis)
  • SendGrid (to send a formatted email to the patient)


Here's a breakdown of the workflow:

n8n Workflow Execution Diagram

  • HL7 Trigger: The workflow (outside of our AI agent) begins with incoming SIU S14 HL7 messages on the Mirth Connect server. We configure this channel to look specifically for messages with the CHECKEDOUT status, which ensures the workflow only activates after a patient has completed their appointment. Mirth Connect then parses the minimum required fields from the HL7 message, transforms them into JSON, and posts the JSON to the n8n webhook.

  • n8n Webhook: The workflow (within our AI agent) begins with an n8n trigger node (Webhook) that listens for incoming messages from Mirth Connect. The Webhook receives a request JSON along the lines of the example below (field values are illustrative):
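```json
{
  "firstName": "John",
  "lastName": "Doe",
  "gender": "M",
  "dob": "1980-05-14",
  "email": "john.doe@example.com",
  "diagnosis": "E11.9 - Type 2 diabetes mellitus without complications"
}
```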


  • Patient Information Extraction: Once a relevant message is received, we use n8n's data manipulation nodes to parse the JSON message and extract key patient information, such as:

    • Patient Name, Gender, DOB
    • Patient Email Address
    • Diagnosis Code (e.g., ICD-10 code)
  • AI-Powered Prompt Generation (OpenAI GPT-4o mini): This is where the intelligence comes in. We'll integrate with the OpenAI API, specifically utilizing the GPT-4o mini model, and construct a dynamic prompt based on the extracted diagnosis. For example:

    Please suggest some patient education materials for Mr. {{ $json.body.firstName }} {{ $json.body.lastName }} suffering from {{ $json.body.diagnosis }}, whose date of birth is {{ $json.body.dob }} and gender is {{ $json.body.gender }}. Please format the response as HTML that can be sent as an email to the patient.

    The model will then process this prompt and generate relevant educational content.

  • Email Sending (SendGrid Integration): Finally, we'll integrate with SendGrid, a reliable email delivery service. We'll configure the SendGrid node in n8n to send an email to the patient's extracted email address. The email body will contain the formatted educational materials generated by the AI. The email content looks like this:



The prompt and HTML output are just an example; the output will only be as good as your prompt and your LLM model.



Most of the text of this blog post was generated using Google Gemini and edited to suit the actual implementation.








Thursday, 13 March 2025

Appointment Scheduler AI Chatbot with FHIR Integration

In today's fast-paced world, convenience is paramount, especially when it comes to healthcare. Imagine being able to book a doctor's appointment simply by having a conversation with a chatbot. This is now a reality, thanks to the power of Dialogflow and FHIR (Fast Healthcare Interoperability Resources).



This blog post will explore how we can create a seamless doctor appointment booking experience using Dialogflow, a natural language understanding platform, and FHIR, a standard for exchanging healthcare information electronically.


Here I will showcase a simple AI chatbot that books a doctor appointment, showing how the integration works behind the scenes and how AI simplifies the process up front for a seamless user experience.


I have used the following to develop this demo: Dialogflow ES and the Medplum FHIR server. Dialogflow ES simplifies the ML training and intent detection based on user inputs, while Medplum simplifies the FHIR-related API work.


Our chatbot is developed for an imaginary ABC Pain Clinic. This clinic has a couple of orthopaedic providers. For this demo we assume the patient is already registered (we won't capture patient demographics, etc.) and that provider schedules/slots are already configured on our FHIR server.
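To make the FHIR side concrete, here is a minimal sketch of a Dialogflow ES fulfillment webhook that looks up free slots on the FHIR server when the booking intent fires. The intent and parameter names, the token handling, and the response wording are illustrative assumptions rather than the demo's exact implementation.

```ts
import express from "express";

const FHIR_BASE = "https://api.medplum.com/fhir/R4"; // Medplum's FHIR endpoint

const app = express();
app.use(express.json());

// Dialogflow ES calls this webhook for intents with fulfillment enabled.
app.post("/dialogflow-webhook", async (req, res) => {
  const intent = req.body.queryResult?.intent?.displayName;

  if (intent === "BookAppointment") { // hypothetical intent name
    const practitionerId = req.body.queryResult.parameters.practitionerId;

    // Search the FHIR server for free slots on this provider's schedule.
    const resp = await fetch(
      `${FHIR_BASE}/Slot?status=free&schedule.actor=Practitioner/${practitionerId}`,
      { headers: { Authorization: `Bearer ${process.env.MEDPLUM_TOKEN}` } }
    );
    const bundle = await resp.json();
    const starts = (bundle.entry ?? []).map((e: any) => e.resource.start);

    // Hand the options back to Dialogflow to present to the user.
    res.json({
      fulfillmentText: `Available slots: ${starts.slice(0, 3).join(", ")}`,
    });
    return;
  }

  res.json({ fulfillmentText: "Sorry, I didn't catch that." });
});

app.listen(8080);
```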

In the screenshots, text displayed right-aligned with a gray background is the user input, and text/rich text displayed left-aligned is the bot response.

Here are the workflow images explaining the Bot Intents and FHIR Integration:

Here are the actual chatbot screenshots:

In the first step the bot greets the user.

Thursday, 30 August 2018

Sample Immunization Records Blockchain using Hyperledger Composer

This is a basic, sample Blockchain implementation of Immunization Records using Hyperledger Composer.
