• MojoMcJojo@lemmy.world
    link
    fedilink
    English
    arrow-up
    2
    ·
    2 hours ago

    In some schools, teachers look the other way, so their pay doesn’t suffer even more. The administration refuses to admit reality, so their funding doesn’t suffer.

  • chronicledmonocle@lemmy.world
    link
    fedilink
    English
    arrow-up
    16
    ·
    5 hours ago

    Solution:

    Warn the students throughout the year “If you use AI all year for your schoolwork, you will fail the exams every month or two”. Teach AI literacy, how it works, and the dangers of not learning the material and how it’s a snowball effect. Teach the “why” it’s important to use your brain to solve the problem, and state that they should be using their own words/math/etc. Students suspected of using AI should have a note sent to their parents early on to hopefully correct their path early.

    Exams are paper and pencil only. No computers. No phones. No smart glasses. When students are getting an A on their homework and flunking their exams, it’ll be pretty obvious. Even a student who has anxiety about exams can get a C.

    • darthelmet@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      1 hour ago

      Even before AI was a big thing, I really wish we had some kind of class, or at least an explanation from our teachers on the basics of how learning works. So much unnecessary drama could have been avoided if the kids had a better understanding of why their teachers were asking them to do things this way instead of just saying “do it this way because if not I’ll give you a bad grade.” Obviously younger kids aren’t going to be equipped to handle all the neuroscience, but I’m sure there is some simplified explanation that could be given that would get the point across.

      This was an extreme example of this, but it was emblematic of the general way my teachers handled students who didn’t understand the point of the assignments or teaching methods: I forget which grade I was in, but in one of my math classes there was a day in class where I was solving the problems but not doing it exactly the way the teacher was teaching it. When he insisted, I asked why I had to do it that way if it works either way. He said something like “Because I have the big desk.” Basically just an appeal to authority without any further explanation. “You’re a dumb kid and I’m an adult, so do what I say.”

  • pixxelkick@lemmy.world
    link
    fedilink
    English
    arrow-up
    22
    ·
    7 hours ago

    My wife is a teacher, she has shown me vibed handed in assignments abd its incredibly obvious.

    Right off the bat, if she gives an assignment to make, say, a slideshow on “Topic” and they talk about a examples A, B, and C in class, and the assignment goes off on tangents about topics F, G, and H instead, it’s an instant red flag.

    This happens cuz the student just copy paste the assignment blurb into gpt, but gpt has no context for what was discussed in class… so it goes off the rails instantly.

    Its also easy to include poison pills in the middle of an assignment if they copy paste it straight into gpt.

    Also theres all the usual markers. Emoji, em dash, and the assignment having way higher verbosity than you know damn well the kid has the vocabulary for. Suddenly they’re speaking at a grade 7~8 levels higher than usual? Uh huh. .

    From her and her teacher friends, Ive been told its extremely obvious to spot still. And its pretty trivial to setup the assignment to poison pill the AI.

    • cass80@programming.dev
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      1
      ·
      6 hours ago

      What if the kid lies and says they didn’t use AI? How successful have they been in convincing admin and parents of their ai usage? While I agree its all damning, its still circumstantial evidence.

      • kent_eh@lemmy.ca
        link
        fedilink
        English
        arrow-up
        4
        ·
        2 hours ago

        What if the kid lies and says they didn’t use AI?

        Have them re-do the assignment in a classroom with the teacher (or any other procror) present.

      • Doomsider@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        ·
        edit-2
        4 hours ago

        Back in the day just one instance of plagerism was very serious. If you got caught doing it more than once you could get expelled.

        Now apparently everyone is using the plagerism machine including the professors. So much for academic integrity.

        • AllHailTheSheep@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          9
          ·
          5 hours ago

          I had a sustainability class where the professor used AI to write the course syllabus, assignments, and feedback. a fucking sustainability class.

          I contacted the office of the president about it at my university but nothing ever happened of it. academia in general has gone off the rails with AI recently. I used to assume those with doctorates we’re bright enough to avoid AI but evidently that’s not the case.

      • Jestzer@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        5 hours ago

        I don’t teach kids, so I don’t know the answer to this, but I imagine what you’d do is add guidelines to the assignment that cause them to either lose significant points or fail if they don’t specifically mention things discussed in assignments and the classroom.

        I’d also like to point out that, yes, we know when kids and adults lazily insert a prompt and lazily paste its response, but anybody with half a brain knows they only need to spend an extra 15 minutes re-prompting and editing it to make it nearly unnoticeable.

        The answer is probably to test them in person with no computer of any kind in front of them.

    • Doomsider@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      5 hours ago

      Amateurs, everyone knows you record classroom discussion, translate it to text, and then feed it into the LLM for context.

    • trougnouf@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      1
      ·
      5 hours ago

      I was unable to get Mistral’s AI to output an emoji recently. They forbid it in the system prompt and it wouldn’t give one out for a pretend life or death situation.

  • Random_Character_A@lemmy.world
    link
    fedilink
    English
    arrow-up
    47
    ·
    13 hours ago

    Ah. Good old days in the 80’s when teacher didn’t even read what you wrote. Grade was given according who you were, did the teacher like you and what your previous grades were. No sudden inspirations to do better.

    • betanumerus@lemmy.ca
      link
      fedilink
      English
      arrow-up
      14
      ·
      edit-2
      6 hours ago

      Replace “teacher” with “management” and you’re describing every workplace.

    • OldQWERTYbastard@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      21
      ·
      11 hours ago

      This happened to me. I was a pretty good kid; brainy too, but one history test kicked my ass. The teacher was the husband of my grandfather’s sister. A distant family cut me a break.

      This phenomenon can continue into adulthood too.

      However, after a certain point it’s not about the grades you make. It’s about the hands you shake.

  • taiyang@lemmy.world
    link
    fedilink
    English
    arrow-up
    105
    ·
    edit-2
    15 hours ago

    Sure as hell ain’t my students; it’s been a steady decline since ChatGPT came out and I think I may have failed more students than ever over unfinished projects. You can’t GPT the semester long project, there’s a paper trail and data to collect can it becomes super clear who is AI brained now…

    Edit: PS, grade inflation has been a thing for a few decades now, btw; the As aren’t the problem so much as the mush brain.

  • ViatorOmnium@piefed.social
    link
    fedilink
    English
    arrow-up
    13
    arrow-down
    1
    ·
    12 hours ago

    ‘A’ grades are suddenly everywhere, as introduction of stochastic parrot as a service reveals the education system is geared towards training parrots instead of teaching humans. What a surprise.

    • OwOarchist@pawb.social
      link
      fedilink
      English
      arrow-up
      52
      ·
      16 hours ago

      Before you can punish for using LLMs, you need to be able to reliably detect the use of LLMs, including guarding against false positives.

      Current AI checkers are woefully inadequate and prone to errors.

      • SalmiakDragon@feddit.nu
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        1
        ·
        6 hours ago

        A teacher I know says it is easy to determine if a student wrote their paper if you interview them about it. You’re right that automated methods are risky.

        • tabarnaski@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          5
          ·
          5 hours ago

          That’s it. As a teacher who has been dealing with this in the last 2-3 years, the only reliable way I have found is to do short interviews.

          Students hand in their work, I grade it, then I ask them verbally a few easy questions about what they mean in specific sections of their work. How they score on these questions is used as a coefficient that I apply on the grade to get the final score.

          So they can use LLMs, but they have to understand its output.

      • grue@lemmy.world
        link
        fedilink
        English
        arrow-up
        12
        arrow-down
        3
        ·
        14 hours ago

        Before you can punish for using LLMs, you need to be able to reliably detect the use of LLMs, including guarding against false positives.

        You can tell they’re using an LLM if they have a computer out during the pen-and-paper test.

        • OwOarchist@pawb.social
          link
          fedilink
          English
          arrow-up
          25
          arrow-down
          1
          ·
          14 hours ago

          How is that allowed?

          Hell, back in my day, teachers were even very picky about what kind of calculator you could use. And if it was a graphing calculator, you had to show them yourself wiping the memory at the beginning of the test.

          (Except for one algerbra teacher, who was really cool about it. He’d allow custom programs to stay on the calculator if you programmed it yourself. On the theory that if you can write a computer program that reliably solves these math problems, then you must have a very good understanding of how to solve these math problems. And, yes, I was one of the few kids who actually did that. Ah, writing my own custom software for the TI-83 on the TI-83, because that seemed easier than actually doing the math problems by hand … good times.)

        • Railcar8095@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          11 hours ago

          Not US, but there’s a tendency of focusing more on the work during the semester than in the exam itself

          LLMs are going to be a massive headache for me when they get older

          • grue@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            1 hour ago

            Right, and that’s what’s going to have to change: a bigger focus on things like in-person tests (including in-person bluebook essays), oral presentations/thesis defenses instead of other project deliverables, etc.

          • OwOarchist@pawb.social
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 hours ago

            but there’s a tendency of focusing more on the work during the semester than in the exam itself

            Perhaps this tendency needs to be reversed?

            If you have one big exam (or a few of them spread over the year) that it’s impossible to use LLM help for and those exams carry enough weight to make the student fail the class if they completely bomb it … then you’ll be stopping the LLM-cheaters dead in their tracks. Sure, they can be lazy and do much of their coursework that way, but if they’re being lazy like that, they’re likely not actually learning anything, and that will show up during the big exams. And when they fail those big exams and then fail the class, hopefully they’ll learn their lesson about relying on LLMs to get them through classes.

    • StarDreamer@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      20
      arrow-down
      1
      ·
      15 hours ago

      We are allowing LLMs for all of our homeworks. As long as you can solve the problems in the indicated way with a reasonable answer.

      In case you are not sure about the “indicated way”, there are practice questions with detailed step-by-step solutions for each hw problem that you just have to change the numbers/equations a bit and you’ll get points.

      What we’ve noticed is that the year-after-year averages are significantly higher, especially this year. However, students are bringing in details that we explicitly didn’t go over in lecture and putting that on the homework (e.g. Delayed branching in Computer Architecture, because it’s a random quirk of MIPS that even assembly programmers don’t have to deal with). None of these details are ever mentioned in lecture or the practice homeworks (in a few cases, they are mentioned with the explicit wording “do not worry about this now”)

      We can only assume people are copying the homework into LLMs and copying the results straight down. The latest exam had a question where students were asked to analyze a specific chunk of assembly code to deduce certain properties about it. Approximately 20-30% of the students didn’t know the FORMAT to answer it, despite it literally being item 1 on last week’s homework.

      And when I say format, I don’t mean exactly “you must write these exact words or you lose points”. It’s literally just point out “line A and B have this property X because of attribute Y”. Just including ABXY as shown in the practice homework is enough. But apparently people are too lazy to read a 10 bullet point answer…

        • greyscale@lemmy.grey.ooo
          link
          fedilink
          English
          arrow-up
          4
          ·
          11 hours ago

          and they’ll be assured they’re deserving of a tech sector job while everyone else is already losing their everything.

        • StarDreamer@lemmy.blahaj.zone
          link
          fedilink
          English
          arrow-up
          5
          ·
          9 hours ago

          Because the goal is to get people to learn/think about something. We don’t care what you use as long as you retain knowledge taught in the course. If what helps you learn is LLMs, then go for it.

          Problem right now is there is a significant amount of people that are using these tools to do the thinking for them. And this is when Office Hours, Homework feedback, Email (I guarantee all students emails are responded to within 24hrs. Most are handled within 30 minutes) are all available and paid for (by tuition). I am even happy to schedule one-on-ones if privacy is a concern, but none of this is being utilized.

    • Ghostalmedia@lemmy.world
      link
      fedilink
      English
      arrow-up
      18
      ·
      15 hours ago

      As someone who works in ed tech these days, I’m kind of down for them as a study tool. For example, synthesizing notes and turning them into flashcards, practice tests, etc. I find that stuff to be suuuper handy if I’m trying to learn something.

      But for cheating, yah, fuck that noise. A lot of classes are moving back to pencil and paper because of this, and I totally support that.

      • fizzle@quokk.au
        link
        fedilink
        English
        arrow-up
        12
        ·
        14 hours ago

        I feel like synthesising notes and turning them into flash cards how i learn things.

        • Ghostalmedia@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          2 hours ago

          I’m also in this camp. That said, I’ve learned that everyone studies differently, and some people learn by synthesizing, and others do better when they have something structured from the jump.

          I often run to weird little study teams where one person studies by taking everyone’s notes and creating sets in something like quizlet, and someone else studies by using the flashcards / tests that get spit out.

        • jaybone@lemmy.zip
          link
          fedilink
          English
          arrow-up
          4
          ·
          10 hours ago

          Exactly. Taking notes in class during a lecture. Copying something the instructor wrote on the board. This is all part of the learning process. The act of doing these things helps you learn.

          • fizzle@quokk.au
            link
            fedilink
            English
            arrow-up
            1
            ·
            10 hours ago

            The only skills or learnings I really seem to have retained from University are the ability to collect, and collate information and then apply it to a problem. The actual information collected and problems solved are lost to me now.

    • iegod@lemmy.zip
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      4
      ·
      8 hours ago

      I dunno, they’re here to stay. Cat’s out of the box. Educators and education need to adapt. In person assessment is probably the ideal way to gauge progress and learning, but due to resources I don’t see it being practical.

      • chronicledmonocle@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        5 hours ago

        Except the whole point of education is to LEARN how to do it without these tools. If you’re just turning your brain off and handing in the output, you are literally missing the point.

        It’s like using calculators on steroids. There are times to use calculators and times to force mental math. You can teach kids AI literacy and usage habits, but letting them just use no thinking makes the entire exercise pointless. We might as well close schools, because having the AI generated your math homework or essay is fucking pointless.

        • iegod@lemmy.zip
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          2 hours ago

          You wildly misunderstood my post. Firstly, I’m not suggesting students turn their brains off. Secondly, how you learn isn’t relevant to the demonstration and application of that knowledge, which is precisely why I said in person assessment is the optimal way. You ask the student and watch them, live. This is how you defend a thesis, in front of a live panel. No tools, no cheating, just your knowledge.

          What I am suggesting is that the system should adapt to the reality of the technology.

      • Encrypt-Keeper@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        2
        ·
        8 hours ago

        they’re here to stay. Cat’s out of the box.

        People keep saying this as though it’s true. The odds that this current era of free and ubiquitous access to these frontier LLMs lasts forever are pretty slim.

        • iegod@lemmy.zip
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          2
          ·
          7 hours ago

          How do you figure? There are open source self host able solutions right now.

          • Encrypt-Keeper@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            edit-2
            7 hours ago

            You can’t run anything like a frontier model on a self hosted solution. To get anywhere close you’d have to spend thousands of dollars on hardware which obviously isn’t free, or even a viable solution for the vast majority of people, let alone these students. And the quality of output you’d get from a model running on off the shelf consumer hardware like a MacBook is much more noticeably AI generated and trivial for AI detection tools to flag.

          • nickiwest@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            1
            ·
            7 hours ago

            Already, very few middle schoolers have the tech savvy to self-host anything. If it’s not a tablet, they have trouble using it.

            Add to that the possibility that the data center run on memory and processors could mean that local computing power will disappear, to be replaced with devices like Chromebooks that use corporate cloud services for everything.

  • OwOarchist@pawb.social
    link
    fedilink
    English
    arrow-up
    20
    ·
    16 hours ago

    The good students are still getting A grades naturally. And the bad students are getting A grades with ChatGPT. A grades for everybody! (Until we get to the closed-book, in-person test at the end of the year…)

  • Peereboominc@piefed.social
    link
    fedilink
    English
    arrow-up
    2
    arrow-down
    1
    ·
    11 hours ago

    Shouldn’t the education system change then? If it is easy to get an A with a machine, should we then not focus on learning something that can’t be done by a machine. I mean, it has value to know things and know what to do insituations where AI is not available.

    • GreenKnight23@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      5 hours ago

      so your answer is to regulate education? one of the most heavily regulated industries, education, you want to further regulate because of the influence of a brand new unregulated industry.

      no