• Sterile_Technique@lemmy.world · 8 points · 13 hours ago

    Recent nursing school graduate here! We had a lot of assignments to find and present data on some disease process or drug or intervention etc. Actually finding credible sources and picking out the data we need and putting it on paper is a super tedious process, and my classmates LOVED zipping through that stuff with some AI shit. And they’d get 100s on their assignments, and everything was just rainbows and unicorn farts… up until test day, where they’d fail or barely pass. Now several of them are struggling to pass the NCLEX.

    Drives me insane. Like, you mother fuckers aren’t here to get a grade, you’re here to learn this shit so you know what to do when you see it in whatever hospital hires your dumb ass.

    Definitely doesn’t paint a pretty picture about the future of medicine.

     

    Why come you no have tattoo??

    • faythofdragons@slrpnk.net · 3 points · 10 hours ago

      Drives me insane. Like, you mother fuckers aren’t here to get a grade, you’re here to learn this shit so you know what to do when you see it in whatever hospital hires your dumb ass.

      This is happening because the job market is absolutely fucked. Students are under the impression that grades are what will drive job prospects, because nobody is hiring on merit any more.

      My SIL has been a nurse in the cardiac surgery department for nearly a decade, and even her hospital is now using AI to screen potential new hires.

      We’re so cooked.

    • deliriousdreams@fedia.io · 4 points · 12 hours ago

      My hope is that the ones who don’t build the skills to work in medicine don’t pass. Because at least then they don’t get to make decisions that affect a person’s health (even in non-life-or-death situations).

      But my trust in schools is waning as more and more of them sign up for ChatGPT and other LLMs, essentially forcing them on students.

      The entire schooling system, including post-secondary education, is handling this pretty poorly from what I can see.

      Using LLMs to detect plagiarism, using them to detect whether something was written by an LLM, using them to detect cheating, using them to write lesson plans, using them to offload pretty significant portions of your job, encouraging students to use them without any safeguards to make sure they do their own work and their own thinking.

      I can’t imagine going to school in this day and age and having so many adults speak out of both sides of their mouths about LLMs this way.

      How can you be a teacher or professor assigning classwork written entirely by an AI and at the same time tell students to use it “responsibly”?

      We don’t even teach students its pitfalls. We don’t explain how to use it responsibly, how to spot it, or what tools can keep us from falling victim to the worst parts of it.