The summer is over, schools are back, and the data is in: ChatGPT is mainly a tool for cheating on homework.

ChatGPT traffic dropped when summer began and schools closed. Now that students are back, they are using the AI tool more again.

  • VictorPrincipum@sh.itjust.works · ↑63 ↓4 · 9 months ago

    “You can use AI such as ChatGPT or Copilot on your senior projects, just make sure the code works, you understand it enough to document it, and your sponsor is ok with external code use” - paraphrased from my Software Engineering department head about our senior capstone projects.

    “I have the (8th grade) kids ask ChatGPT for an essay and then treat it like a rough draft, so they get practice editing it” - my father, an English teacher

    The best way to handle it is to embrace it and use it to augment your skills, much like calculators in math class.

    • grabyourmotherskeys@lemmy.world · ↑34 · 9 months ago

      Both of these methods require the student to understand the work. My old man brain insists they should have to code assembly from scratch and walk through snow storms to a library for their essay research, but in reality this is likely how this technology will be used. It’s a practical approach. The 8th grade version should probably include fact checking.

      • June@lemm.ee · ↑10 ↓1 · 9 months ago

        I think of LLMs the way I thought of calculators when I was in school.

        It’s good to have a fundamental understanding of how it all works, but let the tool be the workhorse and just learn to validate its output.

        • gohixo9650@discuss.tchncs.de · ↑9 ↓2 · 9 months ago

          You can understand how subtraction or multiplication works, but if you don’t do them repeatedly in your head and on paper, you will end up needing a calculator for the most ridiculous things, like getting change or splitting a bill with a friend, or whatever.
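
          (To spell out the kind of arithmetic I mean, with made-up numbers:)

              # Splitting a bill with a 20% tip, the "mental math" written out.
              bill = 84.00
              tip = bill * 0.20              # 10% is 8.40; double it to get 16.80
              per_person = (bill + tip) / 3  # 100.80 split three ways
              print(round(per_person, 2))    # 33.6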

          • Zeoic@lemmy.world · ↑5 ↓3 · 9 months ago

            To be fair, I don’t think the people growing up with calculators in class and using ChatGPT will be using cash at all. I’m not sure when I last held physical currency, honestly. The bill-splitting part might remain, but we don’t really need to worry about calculating change anymore.

            • gohixo9650@discuss.tchncs.de · ↑5 ↓1 · 9 months ago

              This is a strawman argument. Being able to do basic arithmetic in your head has nothing to do with whether you use cash. By your logic we can stop thinking at all: a calculator for mathematical operations, AI for thinking and talking. We can stop thinking entirely since, hey, AI is easier.

            • radix@lemm.ee · ↑2 ↓1 · edited · 9 months ago

              I can confirm I literally never calculate change anymore. The only thing I do in my head is calculate a 20% tip.

              (Obligatory screw tip culture.)

      • Moobythegoldensock@lemm.ee · ↑6 ↓1 · 9 months ago

        Really it needs to start a little younger: in 5th or 6th grade they should be writing short essays in class, by hand, then move on to outlining for larger essays, and then they can start using AI to do the drafts at home.

        • grabyourmotherskeys@lemmy.world · ↑5 · 9 months ago

          I was definitely outlining and writing essays in the early grades, but I was on an accelerated track. My friends from the neighborhood who went to a different junior high entered high school without ever having done this. That blew my mind at the time and still does today, decades later.

          • Moobythegoldensock@lemm.ee · ↑4 ↓1 · edited · 9 months ago

            It blows mine, too. I remember having to do the outlines and hating it, but it really helps you understand the structure of an essay even if you never write that way again.

            I think outlining will actually become an important tool with generative AI. For example, I used it to generate a letter of recommendation last week. To do that, I had to:

            • Write a prompt with enough background for the AI to work with, and include all my talking points
            • Generate the output
            • Read over everything to make sure what it generated was relevant and accurate
            • Edit the draft to reflect my voice, add a sentence or two to emphasize things I wanted to stand out, remove some of the fluff, etc.

            It still turned what was probably an hour’s worth of work into 15 minutes, but at least currently you need to understand what you’re doing to use it this way. Specifically, knowing how to outline made it easy to write a concise yet detailed prompt so I could generate what I wanted on the first try.
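
            For anyone curious, here is roughly that flow sketched as a throwaway script. It is purely illustrative: it assumes the openai Python package, and the model name and talking points are placeholders, not my actual letter.

                # Hypothetical sketch of the outline -> prompt -> draft -> human-edit flow.
                from openai import OpenAI

                client = OpenAI()  # assumes an OPENAI_API_KEY in the environment

                # Step 1: the "outline" - background plus the talking points to cover.
                talking_points = [
                    "Worked with me for two years and grew into a team lead role",
                    "Drove the department's quality-improvement project to completion",
                    "Communicates clearly with colleagues and clients",
                ]
                prompt = (
                    "Write a one-page letter of recommendation for a colleague "
                    "applying to a graduate program. Cover these points:\n- "
                    + "\n- ".join(talking_points)
                )

                # Step 2: generate the draft.
                response = client.chat.completions.create(
                    model="gpt-4o",  # placeholder model name
                    messages=[{"role": "user", "content": prompt}],
                )
                draft = response.choices[0].message.content

                # Steps 3 and 4 stay human: check it for accuracy, then edit it into your own voice.
                print(draft)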

    • Solumbran@lemmy.world · ↑34 ↓12 · 9 months ago

      A calculator calculates. An AI bullshits.

      The only thing ChatGPT might actually be able to do is marketing speeches, since those are nonsensical to start with and made by things pretending to be human.

      • foo@programming.dev · ↑4 · 9 months ago

        It’s arguable that most of what we humans generate is vague, partially inaccurate bullshit.

        • Solumbran@lemmy.world · ↑3 · 9 months ago

          Unless you are a rock, your brain processes information to extract meaning from it. AIs don’t.