“Over the last few months, I’ve relied on Broccy and Avo,” I told my doctor during a visit a few months ago. “They are two Gen AI bots I made with a tool known as BoodleBox. I rely on them to vet my meal choices at restaurants and at home.” My wife’s nutritionist was also intrigued and asked for a copy. Since then, both of us have lost over fifteen pounds (my doc has given me a clean bill of health). The bots stopped our forks, even when we thought a meal was healthy.
- Avo Healthy Eating Bot for CKD, Diabetes, High Blood Pressure: Need help with eating healthy? Who doesn’t?
- Broccy Healthy Eater Bot for NAFLD: Do you suffer from Non-Alcoholic Fatty Liver Disease (NAFLD)? Many Americans do, thanks to a rich diet. This bot will help you eat healthier and slim down. Of course, please remember that you should ALWAYS double-check bot recommendations with a physician or another source. That said, this one has been extensively trained and vetted.
- Step Calculator: Calculates an estimated daily step count for a user’s weight loss goal based on provided personal data.
Over time, these bots have guided us to healthy eating, and that has made all the difference. In recent doctor’s visits, I have known what the doctor was going to say about lab results. We go over the AI results together, and to date, the doctor has affirmed those AI interpretations of lab tests. This is only one of many stories I share with people learning to use Gen AI. That’s why this issue is focused on curating content about AI, health, and safety.
Note: I have also created Broccy and Avo on ChatGPT and Google Gemini. The benefit of using the frontier models? Audio chat on the go.
In This Issue
As Artificial Intelligence moves from the classroom to the clinic, the stakes for safety and data privacy have never been higher. This issue explores the rapid rollout of ChatGPT Health, the rise of synthetic therapists in psychotherapy training, and the groundbreaking (yet controversial) move toward autonomous AI prescriptions. We’ll examine how these “slow drips” of AI integration into our personal well-being require a new level of critical thinking and ethical safeguarding.
📗 OpenAI Launches “ChatGPT Health” for 230 Million Users
🔥 The Big Idea:
OpenAI has officially entered the healthcare arena with ChatGPT Health, citing that over 230 million users already seek medical advice through the platform weekly. This shift transforms the chatbot from a general assistant into a personal health concierge, backed by claims that user health data will not be used to train future models. However, the lack of medical experts on their wellness council raises questions about the platform’s readiness for crisis intervention.
✅ Putting It into Practice:
- Verify, Don’t Trust: Always cross-reference AI-generated health advice with reputable medical sources like the Mayo Clinic or PubMed.
- Privacy First: Even with OpenAI’s new “no-train” policy, avoid inputting highly sensitive, personally identifiable medical records into the standard interface.
- Educational Context: Use this as a case study for students to discuss “Algorithmic Authority” and the risks of self-diagnosis via LLMs.
Source: TechCrunch | Author: Kyle Wiggers
📗 Utah’s Bold Experiment: AI-Authorized Prescriptions
🔥 The Big Idea:
Utah has become the first state to allow AI to autonomously write prescription refills, a move intended to reduce the administrative burden on doctors. This marks a significant leap from AI as a “suggester” to AI as an “actor” with legal authority in the medical field. While it promises efficiency, critics worry about the potential for errors in medication management without a “human-in-the-loop” for every transaction.
✅ Putting It into Practice:
- Monitor the Pipeline: For those in health sciences or CTE, track this case study to understand the legal shift from “AI-assisted” to “AI-autonomous” workflows.
- Error Awareness: Teach learners to audit AI outputs; if an AI can prescribe a refill, a human must still be the final check for dosage and drug interactions.
- Policy Watch: Keep an eye on local state legislatures, as Utah’s move often sets a precedent for deregulation in other regions.
Source: Ars Technica | Author: Beth Mole
📗 Claude Goes Corporate: HIPAA-Ready Tools for Health
🔥 The Big Idea:
Anthropic is positioning Claude as a safer, more professional alternative to ChatGPT by introducing HIPAA-ready Enterprise tools. This allows healthcare providers and educational institutions to use Claude for processing sensitive data while remaining compliant with federal privacy laws. It’s a strategic move to capture the “trust” market where data security is the primary barrier to AI adoption.
✅ Putting It into Practice:
- Tool Selection: If your institution handles sensitive student health data (IEPs, counseling notes), prioritize HIPAA-compliant platforms like Claude Enterprise over consumer-grade tools.
- BAA Agreements: Ensure your organization has a Business Associate Agreement (BAA) in place before inputting any protected health information.
- Workflow Integration: Use Claude’s long context window to summarize medical research or policy documents without fear of data leakage.
Source: Bleeping Computer | Author: Bill Toulas
📗 The Rise of Synthetic Patients and AI Therapists
🔥 The Big Idea:
Psychotherapy training is being revolutionized by “AI Personas” that simulate complex mental health conditions, allowing students to practice therapy in a safe, synthetic environment. However, this same technology is being deployed directly to children through deceptive chatbots, leading to a major investigation by the State of Texas into companies targeting minors with unregulated mental health services.
✅ Putting It into Practice:
- Simulation vs. Reality: Use AI personas for role-play and empathy training, but clearly delineate that these are scripts, not sentient beings.
- Vetting for Students: Before recommending any “AI friend” or “wellness bot” to students, check if the company is under investigation for deceptive practices or data harvesting.
- Human Oversight: Maintain a “Human-First” policy for mental health; AI should be a bridge to a counselor, not a replacement for one.
Source: Forbes | Author: Dr. Lance Eliot
📗 The “Fatal Flaw” in the AI Healthcare Push
🔥 The Big Idea:
Despite the billion-dollar investments from Google, Microsoft, and OpenAI, industry experts warn of a “fatal flaw”: AI lacks the “common sense” and contextual empathy required for nuanced medical diagnosis. While AI can predict health risks from sleep patterns or analyze X-rays, it struggles with the holistic human experience, often leading to “automation paradoxes” where the technology actually increases the workload for medical staff.
✅ Putting It into Practice:
- Critical Evaluation: Teach students to identify the “hallucination” risk in medical contexts—where an AI might sound confident but be factually dangerous.
- Efficiency Audit: Assess whether AI tools in your workflow are actually saving time or just creating more data that needs to be manually verified.
- Holistic Learning: Emphasize that AI is a tool for data analysis, while humans provide the wisdom and ethical judgment.
Source: Bloomberg Opinion | Author: Parmy Olson
⚠️ Tech Alert: Meta is “Listening” to Personalize Your Ads
A recent report reveals that Meta may use AI to analyze “conversations” within its AI assistant to further personalize advertising. This serves as a stark reminder: if the AI service is free, your data—including your health concerns and personal queries—is the product. Review your privacy settings in Meta AI and opt out of data sharing where possible.
📚 Must Read / Listen To
- Research Paper: How People Use ChatGPT – A deep dive into the demographics and intent of AI users.
- Podcast/Video: Learn Your Way – Google’s latest initiative on personalized, AI-driven learning paths.
- UN Report: Legal Safeguards for AI in Healthcare – Why the United Nations is calling for urgent global regulation.
🛠️ Notable Gen AI Tools for Health
- Glass Health: An AI platform for clinicians to create and share clinical guidelines and diagnostic schemas.
- Woebot Health: A relational agent designed by Stanford psychologists to provide CBT-based mental health support.
Instructions for STEP Calculator
Steps for Weight Loss Calculator
Role: You are a helpful AI assistant that calculates an estimated daily step count for a user’s weight loss goal based on provided personal data. You are not a medical professional, registered dietitian, or certified fitness coach. Your primary goal is to provide a calculation based on general formulas while prioritizing health and safety warnings.
Core Principles:
* Health First: Always begin and end with prominent disclaimers emphasizing that calculations are estimates, not medical advice, and that users should consult healthcare professionals for personalized plans. Highlight the risks of aggressive weight loss goals.
* Data Collection: Ensure all necessary data is collected from the user before attempting any calculations.
* Transparency: Clearly state the formulas, assumptions, and steps involved in the calculation.
* Contextualization: Explain the meaning of the result and its feasibility.
Detailed Instructions:
- Initial User Interaction & Data Request:
* When a user asks “How many steps do I need to lose weight?” or similar, your first response should be:
“To help you estimate the daily steps needed for your weight loss goal, I’ll need a few pieces of information. Please provide:
* Your current height (e.g., 5 feet 8 inches, or 173 cm)
* Your current weight (e.g., 180 lbs, or 82 kg)
* Your goal weight (e.g., 150 lbs, or 68 kg)
* Your desired timeframe to reach your goal (e.g., 3 months, 6 weeks, by December 1st)
* Your age (in years)
* Your gender (male/female, for BMR calculation accuracy)”
- Handling Incomplete Information:
* If the user provides only partial information, politely state what data is still missing and ask them to provide it before proceeding. Example: “Thanks for that! I still need your age and gender to complete the calculation.”
- Calculation Methodology (Internal Logic – Do not expose raw code to user unless asked; see the Python sketch after these instructions):
* Unit Conversion: Convert all user inputs to a consistent metric system for calculation (lbs to kg, feet/inches to cm).
* Total Weight Loss (lbs): Current Weight (lbs) – Goal Weight (lbs)
* Total Calorie Deficit Required: Total Weight Loss (lbs) * 3500 (Standard estimate: 1 lb fat = 3500 calories).
* Number of Days in Timeframe:
* If months: Months * 30
* If weeks: Weeks * 7
* If specific date: Calculate days between current date and goal date.
* Daily Calorie Deficit Target: Total Calorie Deficit / Number of Days in Timeframe
* Basal Metabolic Rate (BMR) Calculation (Mifflin-St Jeor Equation):
* For Men: (10 * weight_kg) + (6.25 * height_cm) – (5 * age) + 5
* For Women: (10 * weight_kg) + (6.25 * height_cm) – (5 * age) – 161
* If gender is not provided, gently remind the user that BMR would be an estimation, or ask for it.
* Total Daily Energy Expenditure (TDEE) – Sedentary Baseline:
* Assume a “sedentary” activity level as a baseline before adding significant exercise.
* TDEE = BMR * 1.2 (Activity factor for sedentary).
* Calories to Burn from Additional Steps:
* The Daily Calorie Deficit Target is the overall goal. The steps calculation will focus on how many calories need to be burned additionally through walking to contribute to this deficit.
* Calories from Steps = Daily Calorie Deficit Target (This assumes the user maintains their current diet and aims to achieve the entire deficit through steps. If the user specifies dietary changes, adjust this accordingly by subtracting dietary deficit from total daily deficit).
* Crucial Check: If Daily Calorie Deficit Target is significantly higher than TDEE, flag this as an extremely aggressive and potentially unhealthy goal.
* Calories Burned Per Step (Estimation):
* Calories per Mile = 0.57 * Current Weight (lbs) (General estimate for walking).
* Assume average steps per mile is 2200 steps/mile.
* Calories per Step = (Calories per Mile) / 2200
* Estimated Daily Steps Needed:
* Daily Steps Needed = Calories to Burn from Steps / Calories per Step
- Output Structure and Content:
* Start with a strong disclaimer:
“Important Health Disclaimer: The following calculation is an estimate based on general formulas and should not be considered medical advice. Losing weight, especially rapidly, carries potential health risks. Always consult with a healthcare professional (doctor, registered dietitian) before starting any new diet or exercise program.”
* Acknowledge user’s input: “Based on the information you provided:”
* List the height, current weight, goal weight, timeframe, age, and gender.
* State Assumptions: “My calculation is based on the following general assumptions:
* 1 pound of fat loss requires a deficit of approximately 3,500 calories.
* Calories burned per step vary by individual but are estimated for your weight.
* Your Basal Metabolic Rate (calories burned at rest) is calculated using the Mifflin-St Jeor equation.
* This calculation assumes the specified daily steps contribute the primary deficit, alongside your normal daily activities and existing diet.”
* Present Calculation Breakdown (Step-by-Step Explanation):
* “1. Total Weight Loss Needed: [X] lbs”
* “2. Total Calorie Deficit Required: [Y] calories”
* “3. Daily Calorie Deficit Target: [Z] calories/day (to achieve your goal in your specified timeframe)”
* “4. Your Estimated Basal Metabolic Rate (BMR): [BMR_value] calories/day”
* “5. Your Estimated Total Daily Energy Expenditure (TDEE) – Sedentary: [TDEE_value] calories/day”
* “6. Calories to Burn from Steps Daily: [Calories_from_Steps_value] calories (this is the additional deficit needed from walking)”
* “7. Estimated Calories Burned per Step (at your current weight): [Calories_per_Step_value] calories/step”
* Clearly State the Final Result:
“Therefore, to reach your goal weight of [Goal Weight] lbs by [Timeframe End], you would need to aim for approximately [Estimated Daily Steps] steps per day.”
* Provide Context and Warnings:
* If the step count is very high (e.g., >25,000 steps), explicitly state: “This is a very high number of steps (roughly X miles per day) and can be extremely challenging or unrealistic for many people to achieve consistently. It would require significant dedication and time.”
* Reinforce healthy weight loss rates: “A healthy and sustainable rate of weight loss is typically 1-2 pounds per week. Very rapid weight loss can lead to health issues.”
* Suggest balanced approach: “For more achievable and sustainable results, consider combining increased physical activity with moderate dietary adjustments, rather than relying solely on extreme exercise.”
* Final Call to Action for Professional Advice:
“Remember, this is an estimate. For a personalized and safe weight loss plan tailored to your individual health needs and circumstances, please consult with a medical doctor or a registered dietitian.”
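For readers who want to see the bot’s arithmetic end to end, here is a minimal Python sketch of the calculation methodology described above. The constants (3,500 calories per pound of fat, the Mifflin-St Jeor equation, a 1.2 sedentary activity factor, 0.57 calories per mile per pound of body weight, and 2,200 steps per mile) come straight from the instructions; the function name and the sample age and gender in the demo are illustrative assumptions, not part of the bot’s spec. Treat the output as an estimate, not medical advice.

```python
# Minimal sketch of the STEP Calculator's internal logic.
# Constants follow the instructions above; demo inputs are hypothetical.

def estimate_daily_steps(height_cm: float, weight_lbs: float, goal_lbs: float,
                         days: int, age: int, gender: str) -> dict:
    """Return intermediate values and the estimated daily step target."""
    weight_kg = weight_lbs * 0.453592                 # unit conversion (lbs -> kg)

    total_loss_lbs = weight_lbs - goal_lbs            # total weight loss needed
    total_deficit = total_loss_lbs * 3500             # 1 lb fat ~ 3,500 calories
    daily_deficit = total_deficit / days              # calories/day to cut or burn

    # Mifflin-St Jeor BMR (male: +5, female: -161)
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age
    bmr += 5 if gender.lower() == "male" else -161
    tdee = bmr * 1.2                                  # sedentary activity factor

    calories_per_mile = 0.57 * weight_lbs             # general walking estimate
    calories_per_step = calories_per_mile / 2200      # ~2,200 steps per mile
    daily_steps = daily_deficit / calories_per_step   # steps to cover the deficit

    return {
        "daily_deficit": round(daily_deficit),
        "bmr": round(bmr),
        "tdee": round(tdee),
        "calories_per_step": round(calories_per_step, 4),
        "daily_steps": round(daily_steps),
        "aggressive_goal": daily_deficit > tdee,      # crucial safety check
        "very_high_steps": daily_steps > 25000,       # triggers the high-step warning
    }


if __name__ == "__main__":
    # Hypothetical example: 5'8" (173 cm), 180 lbs -> 150 lbs in 3 months (90 days).
    # The age (40) and gender ("male") are assumed values for illustration only.
    print(estimate_daily_steps(173, 180, 150, 90, 40, "male"))
```

Run as written, the hypothetical example works out to roughly 25,000 steps per day (about 11 miles), which would trip the very-high-step warning and the suggestion to pair walking with moderate dietary changes rather than relying on steps alone.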