
highplainsdem

(59,439 posts)
Wed May 7, 2025, 01:00 PM May 2025

Everyone Is Cheating Their Way Through College (James D. Walsh, NYMag. Horrifying read on ChatGPT destroying education)

https://nymag.com/intelligencer/article/openai-chatgpt-ai-cheating-education-college-students-school.html
https://archive.ph/DzOE6 (Adding the Archive page because even though I've rarely looked at anything in New York magazine in recent months - and I haven't posted an OP about one of their articles since last September - I got a popup saying I'm past my monthly limit, so their detection of the number of visits is waaaay off.)

Everyone Is Cheating Their Way Through College
ChatGPT has unraveled the entire academic project.
By James D. Walsh

-snip-

In January 2023, just two months after OpenAI launched ChatGPT, a survey of 1,000 college students found that nearly 90 percent of them had used the chatbot to help with homework assignments. In its first year of existence, ChatGPT’s total monthly visits steadily increased month-over-month until June, when schools let out for the summer. (That wasn’t an anomaly: Traffic dipped again over the summer in 2024.) Professors and teaching assistants increasingly found themselves staring at essays filled with clunky, robotic phrasing that, though grammatically flawless, didn’t sound quite like a college student — or even a human. Two and a half years later, students at large state schools, the Ivies, liberal-arts schools in New England, universities abroad, professional schools, and community colleges are relying on AI to ease their way through every facet of their education. Generative-AI chatbots — ChatGPT but also Google’s Gemini, Anthropic’s Claude, Microsoft’s Copilot, and others — take their notes during class, devise their study guides and practice tests, summarize novels and textbooks, and brainstorm, outline, and draft their essays. STEM students are using AI to automate their research and data analyses and to sail through dense coding and debugging assignments. “College is just how well I can use ChatGPT at this point,” a student in Utah recently captioned a video of herself copy-and-pasting a chapter from her Genocide and Mass Atrocity textbook into ChatGPT.

-snip-

It isn’t as if cheating is new. But now, as one student put it, “the ceiling has been blown off.” Who could resist a tool that makes every assignment easier with seemingly no consequences? After spending the better part of the past two years grading AI-generated papers, Troy Jollimore, a poet, philosopher, and Cal State Chico ethics professor, has concerns. “Massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentially illiterate,” he said. “Both in the literal sense and in the sense of being historically illiterate and having no knowledge of their own culture, much less anyone else’s.” That future may arrive sooner than expected when you consider what a short window college really is. Already, roughly half of all undergrads have never experienced college without easy access to generative AI. “We’re talking about an entire generation of learning perhaps significantly undermined here,” said Green, the Santa Clara tech ethicist. “It’s short-circuiting the learning process, and it’s happening fast.”

-snip-

Still, while professors may think they are good at detecting AI-generated writing, studies have found they’re actually not. One, published in June 2024, used fake student profiles to slip 100 percent AI-generated work into professors’ grading piles at a U.K. university. The professors failed to flag 97 percent. It doesn’t help that since ChatGPT’s launch, AI’s capacity to write human-sounding essays has only gotten better. Which is why universities have enlisted AI detectors like Turnitin, which uses AI to recognize patterns in AI-generated text. After evaluating a block of text, detectors provide a percentage score that indicates the alleged likelihood it was AI-generated. Students talk about professors who are rumored to have certain thresholds (25 percent, say) above which an essay might be flagged as an honor-code violation. But I couldn’t find a single professor — at large state schools or small private schools, elite or otherwise — who admitted to enforcing such a policy. Most seemed resigned to the belief that AI detectors don’t work. It’s true that different AI detectors have vastly different success rates, and there is a lot of conflicting data. While some claim to have less than a one percent false-positive rate, studies have shown they trigger more false positives for essays written by neurodivergent students and students who speak English as a second language. Turnitin’s chief product officer, Annie Chechitelli, told me that the product is tuned to err on the side of caution, more inclined to trigger a false negative than a false positive so that teachers don’t wrongly accuse students of plagiarism. I fed Wendy’s essay through a free AI detector, ZeroGPT, and it came back as 11.74 percent AI-generated, which seemed low given that AI, at the very least, had generated her central arguments. I then fed a chunk of text from the Book of Genesis into ZeroGPT and it came back as 93.33 percent AI-generated.

-snip-

It’ll be years before we can fully account for what all of this is doing to students’ brains. Some early research shows that when students off-load cognitive duties onto chatbots, their capacity for memory, problem-solving, and creativity could suffer. Multiple studies published within the past year have linked AI usage with a deterioration in critical-thinking skills; one found the effect to be more pronounced in younger participants. In February, Microsoft and Carnegie Mellon University published a study that found a person’s confidence in generative AI correlates with reduced critical-thinking effort. The net effect seems, if not quite Wall-E, at least a dramatic reorganization of a person’s efforts and abilities, away from high-effort inquiry and fact-gathering and toward integration and verification. This is all especially unnerving if you add in the reality that AI is imperfect — it might rely on something that is factually inaccurate or just make something up entirely — with the ruinous effect social media has had on Gen Z’s ability to tell fact from fiction. The problem may be much larger than generative AI. The so-called Flynn effect refers to the consistent rise in IQ scores from generation to generation going back to at least the 1930s. That rise started to slow, and in some cases reverse, around 2006. “The greatest worry in these times of generative AI is not that it may compromise human creativity or intelligence,” Robert Sternberg, a psychology professor at Cornell University, told The Guardian, “but that it already has.”

-snip-



Much more at the link.
77 replies
Everyone Is Cheating Their Way Through College (James D. Walsh, NYMag. Horrifying read on ChatGPT destroying education) (Original Post) highplainsdem May 2025 OP
Bookmarking for reading later Demovictory9 May 2025 #1
For upper division courses, my wife... Happy Hoosier May 2025 #2
Good for her. And the teachers I've discussed this with know that oral exams (or a talk with the student highplainsdem May 2025 #20
I have long wondered why professors PoindexterOglethorpe May 2025 #3
Probably too time consuming for professors womanofthehills May 2025 #8
I think some do. My daughter has to submit her rough drafts Bristlecone May 2025 #45
I did that TimeToGo May 2025 #47
What makes you think they don't? Happy Hoosier May 2025 #72
Don't forget--this cheating will impact those who enter medical school, law school and others... hlthe2b May 2025 #4
I agree completely. highplainsdem May 2025 #41
Actually- the info on these sites including Groc is amazing womanofthehills May 2025 #5
I haven't used AI much, but when I was writing a paper EdmondDantes_ May 2025 #10
Sometimes AI educates you about an ignorance you didn't know you had. Lucky Luciano May 2025 #35
I remember you praising Grok before (it isn't Groc; never heard of an AI named Groc, but Twitter/X has Grok). highplainsdem May 2025 #12
As to using a chatbot for school assignments - Ms. Toad May 2025 #27
I've seen surveys of students indicating that most of them do consider the use of AI highplainsdem May 2025 #32
What the student thinks isn't relevant to whether it is cheating or not. Ms. Toad May 2025 #43
There's nothing truly artistic or creative in having an image generator spit out lots of options and highplainsdem May 2025 #50
You needn't be stunned. Ms. Toad May 2025 #53
I've played with image generators. I know how little control words give the AI user over the image created by highplainsdem May 2025 #60
AI can certainly be used without much creativity. Ms. Toad May 2025 #66
I don't disparage photography and never have. But I consider genAI unethical, antithetical to creativity, highplainsdem May 2025 #67
You are mixing arguments. Ms. Toad May 2025 #69
Photography would never have been considered art if, instead of capturing an image of what's in front of it, highplainsdem May 2025 #70
You are being very clear that your understanding of AI as part of the creative process is as simplistic, Ms. Toad May 2025 #74
A camera captures an image of something real in front of the camera. It captures and records highplainsdem May 2025 #75
If you've never tried to generate an artistic image via an AI prompt, you might want to give it a try LearnedHand May 2025 #59
I've used image generators. And I know that no matter what the prompt is, it not only doesn't provide highplainsdem May 2025 #61
I read a good reply to this article by Jacob T. Levy on bluesky senseandsensibility May 2025 #6
Yes. There've been recommendations of going back to handwritten exams using blue books since highplainsdem May 2025 #15
I have to wonder if some professors are resistant to this because senseandsensibility May 2025 #19
And some teachers think they should be allowed to use AI to grade their students. Sigh. highplainsdem May 2025 #21
It's not a matter of being allowed to use AI to grade their students - Ms. Toad May 2025 #36
My daughter just took finals in college on a blue book Pisces May 2025 #31
Glad to hear it. highplainsdem May 2025 #33
The old blue book method Stuckinthebush May 2025 #17
I think so senseandsensibility May 2025 #22
Unless you're Trump, then each of your ghostwriters and test-takers can have 20 pencils. JustABozoOnThisBus May 2025 #29
Not true - he said 5 pencils... Dan May 2025 #34
Lots of kids now can only print by hand, and slowly. highplainsdem May 2025 #24
Kick SheltieLover May 2025 #7
Ask ChatGPT what to do about that problem bucolic_frolic May 2025 #9
It isn't "theirs" and can never be theirs if they're just altering what a chatbot gave them. highplainsdem May 2025 #13
I just took a continuing JBTaurus83 May 2025 #11
Instructors like that are cheating their students. highplainsdem May 2025 #14
What's the value in that class? WhiskeyGrinder May 2025 #16
I agree JBTaurus83 May 2025 #37
The future of America is being made from the ignorance of so called instructors like this. live love laugh May 2025 #42
None whatsoever, but . . . HoneyAndLocusts May 2025 #62
I've heard that one tell for AI is AI likes to use dashes in their writing, like colleagues--especially & power--have FSogol May 2025 #71
Seems foolish to invest in college and not mzmolly May 2025 #18
Some kids believe only the degree is important, and not how you got it. highplainsdem May 2025 #25
No surprise here misanthrope May 2025 #23
I was a GTA about 25 years ago . . . hatrack May 2025 #30
By 25 years ago, TV was already taking.up more of kids' time than it had been 10-20 years earlier. highplainsdem May 2025 #38
some of the slide predates AI cab67 May 2025 #26
You sound like a real teacher, not just a placeholder standing in front of a class. Bravo. erronis May 2025 #28
vertebrate diversity, evolution, paleontology cab67 May 2025 #46
My natural OCD helped me glide through college Random Boomer May 2025 #40
I did that, too. badhair77 May 2025 #63
We need to go back to blue book exams in person. SidneyR May 2025 #39
Professors are using AI to grade papers. It's pathetic. nt SunSeeker May 2025 #44
As John Stossel said thirtyish years ago on 20//20 "Cheating is good" Clouds Passing May 2025 #48
I am a teacher. Balatro May 2025 #49
I used multiple choice questions and scantrons but I write the questions and badhair77 May 2025 #64
The worst part is, THEY DON'T CARE. LisaM May 2025 #51
Neil deGrasse Tyson said something I'll never forget ybbor May 2025 #52
there is the incentive to pile up degrees and credentials to get a jump in the hiring process Demovictory9 May 2025 #54
"I spend so much time on TikTok," she said. "Hours and hours, until my eyes start hurting, which makes it hard to plan Demovictory9 May 2025 #55
AI is going to take many of their jobs madville May 2025 #56
this is kind of hilarious: students in her Ethics and Tech class used AI to respond to "Briefly introduce yourself and Demovictory9 May 2025 #57
"Massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentia Demovictory9 May 2025 #58
It's so depressing that it has been happening so fast. I knew it would hurt education, but didn't expect highplainsdem May 2025 #65
i hated papers, but i almost always did fresh subjects, art school for sure. cannibalism in hs. habbit or acquired taste pansypoo53219 May 2025 #68
There are ways to adjust to the new environment ecstatic May 2025 #73
US Colleges are archaic so perhaps this will finally prompt them to update GreatGazoo May 2025 #76
I think we're rapidly heading to the point Diraven May 2025 #77

Happy Hoosier

(9,368 posts)
2. For upper division courses, my wife...
Wed May 7, 2025, 01:05 PM
May 2025

Has a paper conference with every student to discuss their paper. If they tried to AI their way through it, it is immediately obvious.

highplainsdem

(59,439 posts)
20. Good for her. And the teachers I've discussed this with know that oral exams (or a talk with the student
Wed May 7, 2025, 01:53 PM
May 2025

about a paper) and handwritten exams in class can pretty much weed out cheating via AI (though two former Columbia students caught cheating are working on creating what look like normal glasses but will read or hear questions and supply answers in a way they hope to make undetectable to others).

PoindexterOglethorpe

(28,390 posts)
3. I have long wondered why professors
Wed May 7, 2025, 01:07 PM
May 2025

don't require students to actually turn in their notes and rough draft of a paper. I seem to recall I was required to do that for one English paper.

womanofthehills

(10,678 posts)
8. Probably too time consuming for professors
Wed May 7, 2025, 01:17 PM
May 2025

All students are doing their research on internet & even Groc will tell you the source of info they give you.

Bristlecone

(10,979 posts)
45. I think some do. My daughter has to submit her rough drafts
Wed May 7, 2025, 03:33 PM
May 2025

Prior to paper submission. Everything has to include source citations, draft included. etc.

TimeToGo

(1,435 posts)
47. I did that
Wed May 7, 2025, 03:46 PM
May 2025

Plus, I tried to make assignments specific to questions we were working through. You could ask AI those questions, but it wouldn’t be able to work through the particular nuance very well.

That said, I’m glad I retired last year. AI is just going to get sharper. I’m just not sure about the future of higher education.

Happy Hoosier

(9,368 posts)
72. What makes you think they don't?
Thu May 8, 2025, 09:31 AM
May 2025

Different classes do different things, of course. My wife is a college English professor (literature), and she has students do a large project and requires them to turn in work product along the way: source list, annotated bibliography, rough draft, final draft. She also has a "paper conference" before they turn in the final paper. Students learn the process of writing an academic paper, and the process means ChatGPT isn't all that useful. She has seen a recent decline in student writing skills, which may be COVID related, AI related, or both.

hlthe2b

(112,517 posts)
4. Don't forget--this cheating will impact those who enter medical school, law school and others...
Wed May 7, 2025, 01:09 PM
May 2025

Do you really want to trust your health, livelihood, or freedom to someone who learned zilch but may have been passed along thanks (largely) to AI? So, you think med students would get caught up during clinics and residency--well one would hope but I've seen my share of pure bullshitters that can manage to "fake it" sufficiently (until they cannot).

The former doesn't even begin to address the "brain drain" (already physically on the way to Europe and elsewhere thanks to Trump) that will be replaced by the "science" of those whose talents are limited to being able to use AI.

The future is not (even remotely) a bright one if we don't get this under control.

womanofthehills

(10,678 posts)
5. Actually- the info on these sites including Groc is amazing
Wed May 7, 2025, 01:12 PM
May 2025

Groc will even give you top 10 articles you can find on subject.

My boyfriend was reminiscing about how he grew up with encyclopedias, and now we can get answers to most questions in under a minute.

There is a fine line between cheating & researching now.

EdmondDantes_

(1,273 posts)
10. I haven't used AI much, but when I was writing a paper
Wed May 7, 2025, 01:25 PM
May 2025

And struggling to find citations, many of the citations it suggested were made up or at least didn't seem to exist via regular searching. Obviously none of them were used in my work, nor did I use the AI to write anything. But I think that's very much the nature of LLMs, in that they're at least partially fancy text predictors. So for things like coding, where there's generally a preferred way to do something, it works to provide a reasonably close output. But they suck at saying "no results found" or telling you how confident the answer is, so everything is spewed out the same.

But even for coding which is a best case for an LLM, you're still shortcutting your knowledge process which includes failing, which includes finding things you didn't know that might work for another problem or another implementation. Sure in theory it might foster interest in learning more in some people, but most people will take the easier short term path of completing the assignment.

Lucky Luciano

(11,810 posts)
35. Sometimes AI educates you about an ignorance you didn't know you had.
Wed May 7, 2025, 02:51 PM
May 2025

I had to do some matrix computations in Python, which is a very slow language unless you use numpy, a library that is very popular with quants because it calls very fast C functions. I used numpy a lot, but I could not figure out how to make one million large quadratic-form calculations blazing fast until I asked ChatGPT and learned about numpy’s einsum (Einstein summation), which was 10,000 times faster than looping through individual matrix multiplications — which are fast, but not like this. It blew my mind that this existed, and it has been key to my work since then!
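For anyone curious, the trick the poster describes can be sketched like this. The shapes below are illustrative assumptions (the post doesn't give dimensions): a batch of one million vectors, each paired with the quadratic form x·A·x, computed in a single einsum call instead of a Python-level loop.

```python
import numpy as np

# Illustrative sizes, not from the post: one million vectors of dimension 8.
rng = np.random.default_rng(0)
X = rng.standard_normal((1_000_000, 8))
A = rng.standard_normal((8, 8))

# One einsum call computes q[i] = sum_{j,k} X[i,j] * A[j,k] * X[i,k]
# for every row i at once, entirely in compiled code.
q = np.einsum("ij,jk,ik->i", X, A, X)

# Equivalent Python loop on a small sample, to confirm the result matches.
q_loop = np.array([x @ A @ x for x in X[:100]])
assert np.allclose(q[:100], q_loop)
```

The speedup comes from replacing a million interpreted-Python iterations with one vectorized contraction; the exact factor depends on matrix sizes and hardware.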

highplainsdem

(59,439 posts)
12. I remember you praising Grok before (it isn't Groc; never heard of an AI named Groc, but Twitter/X has Grok).
Wed May 7, 2025, 01:26 PM
May 2025

I pointed out there that Grok, like other chatbots, makes plenty of mistakes.

Link to my April 26 reply to you, which includes links to two DU threads on how badly Grok screws up:

https://www.democraticunderground.com/100220270781#post15

You don't learn anything about research - and learn very little, period - when you ask a chatbot for answers.

And using a chatbot for school assignments is clearly cheating, unless the teacher has specifically authorized it.

Ms. Toad

(38,062 posts)
27. As to using a chatbot for school assignments -
Wed May 7, 2025, 02:23 PM
May 2025

it isn't inherently cheating just because it wasn't specifically authorized.

Any course syllabus, these days, should address the use of generative AI. But not every teacher/professor updates their syllabi. So, in the absence of express authorization or prohibition, the question of whether it is cheating will depend on the wording of the syllabi and/or the school's code of conduct. But using AI is no more inherently cheating than the use of calculators was in the mid- to late '70s, or the use of spell/grammar-check was a decade later. When new technology is introduced, those in education should adapt their syllabi to address its use so that there are no misunderstandings about what is permitted. And, if there is any question, students should ask. But a blanket statement that it is cheating unless specifically authorized isn't, in my experience as both educator and student, accurate.

I used AI (trained exclusively on licensed images) to generate a base image that I used in a recent art project. The syllabus for the course was silent on the use of AI, as was the school's code of conduct. As long as I did not claim the image I generated as my own, using AI as a component of the project was consistent with both the school's code of conduct and the course syllabus.

That said - I followed my own advice and expressly obtained permission to use it. As a result of the conversations we had around how I intended to use AI, the instructor has asked my permission to use the piece in future classes as an example of good use of AI as a tool, and will likely change the syllabus to explicitly address it.

highplainsdem

(59,439 posts)
32. I've seen surveys of students indicating that most of them do consider the use of AI
Wed May 7, 2025, 02:42 PM
May 2025

to be cheating. It's cheating IMO even if teachers are misguided enough to authorize its use.

What's generated by AI is not the work of the AI user, any more than it would be if they hired someone to do the work for them.

If you're being tested for basic math skills, use of a calculator is cheating.

If you're being tested for spelling, use of a spellchecker is cheating.

GenAI tools generate text, images, video and music - all of which result from many separate decisions and skills when humans do them. Having AI do any of those for you, substituting what it can generate as a plagiarism machine - and mindlessly giving users options, since any prompt can keep the AI offering countless options - is in no way comparable to using a calculator or spellchecker.

Ms. Toad

(38,062 posts)
43. What the student thinks isn't relevant to whether it is cheating or not.
Wed May 7, 2025, 03:27 PM
May 2025

It is what the instructor/school think - and, more important, what the guidelines say. The teachers and schools always set the rules, not the student.

Getting the right numerical answer is part of what is being tested in any math class - whether it is basic math skills or abstract algebra. Whether using a calculator is cheating depends on the instructor. As someone who taught math for 11 years (including basic math), and has two degrees in math, I am qualified to assess that. When I started allowing the use of calculators in my basic math classes in the late '70s, many other math teachers forbade them. Then my students started significantly outperforming theirs on the standard test which all basic math students were required to pass - even when they had to take the class without using calculators. That is because the skills I taught them about process (which calculations were needed) and estimation freed them from the mechanics of arithmetic to learn the skills they needed to survive in real life. They used those skills on the exams to figure out when to use which functions - and to eliminate wrong answers because they were nowhere near the estimated answer - because they weren't struggling with arithmetic. (These were students who, for the most part, failed the 9th grade after passing every other grade without learning anything, because the system I taught in could not fail a student without parental permission before 9th grade. They struggled with everything but the simplest arithmetic test - so we were working on basic skills like being able to calculate whether they were being paid what they were owed when they worked 35 hours at $8.75 an hour, or figuring out the unit price of items in the stores.)

Same for spelling. When I was a law instructor, grammar and spelling were always part of the grade - so spelling is always being tested, even at the graduate-school level. No one is expected to turn off spellcheck before they complete their papers.

And as for creatives using AI - the fact that multiple options are generated makes it an even better analogy to calculators and spellchecking. The first (and repeated) step in art is ideation: the generation of lots of (in visual arts) rough sketches for a project - and then evaluating those images for composition, among other things. So generating multiple images is precisely what allows the artist to apply their creative eye to select those which are artistic from those which are garbage - in the same way they would from their own sketches. They are just creating sketches using a verbal description rather than their hands. This is analogous to the process of using a calculator to create a potential answer, then using estimation skills to ensure that it is not wildly inaccurate, or to the process of spellchecking, in which multiple potential words are provided and the author must use their understanding of the intended meaning to choose the correct one.

While AI can be used without any thought (just as calculators and spellcheck can be), they are (all three) powerful tools which someone skilled in mathematics, writing, or art can use to aid their productivity.

(Again - I am not in favor of using AI which was trained on stolen works. That is a significant legal and moral issue - but a separate one from whether there are good uses for generative AI.)

highplainsdem

(59,439 posts)
50. There's nothing truly artistic or creative in having an image generator spit out lots of options and
Wed May 7, 2025, 04:08 PM
May 2025

then choosing one. The AI user isn't creating the image via prompts any more than someone shopping online for a clothing item using keywords created the item they finally chose.

So generating multiple images is precisely what allows the artist to apply their creative eye to select those which are artistic from those which are garbage - in the same way they would from their own sketches.


Stunned that you would compare choosing from images the AI user didn't create to an artist creating original sketches, using their own skills to start work on something they will finish.

Art comes from inspiration, skills, knowledge and intent. AI art is mindless plagiarism. Its mindlessness is why AI users have to be selective. There's more real artistry in a human artist starting with just a few brush strokes or pencil lines than in any number of AI images.

And I'm not aware of any AI image generator that was entirely ethically trained, unless one was trained only on images that were bought or licensed specifically for use as AI training data. There are a lot of very unhappy photographers and artists who discovered that platforms and tools they were using, which had nothing to do with AI originally, had owners who decided the terms of service could be reinterpreted to use those photos and images to train AI to create new images to compete with the photographers' and artists' work. I consider what was done to those creatives a ripoff.

As I recall, too, Adobe, despite early boasting about ethical training of its AI, used countless AI-generated images from Midjourney - infamous for how much intellectual property it stole - to train its Firefly AI tool.

Ms. Toad

(38,062 posts)
53. You needn't be stunned.
Wed May 7, 2025, 04:27 PM
May 2025
Stunned that you would compare choosing from images the AI user didn't create to an artist creating original sketches, using their own skills to start work on something they will finish.


First, the AI user did create the sketches - just with words, rather than pen and paper.

Second - have you heard of aphantasia? Most people, including artists have a "mind's eye," which allows them to visualize things. Close your eyes and try to picture an apple. If you can see the apple, you have the main neurological variant of a mind's eye. Some of us, including a fair number of artists, see absolutely no images. So when I am in the ideation phase of creating art, words are a lot closer to what I "see" than images - so sketching those original thoughts with a tool that can turn words into images is similar to the process of sketching with pen and paper for people without aphantasia.

It is still not an efficient process for me - because what I "see" is still closer in form and function to vectors than to words. So most of my ideation goes undocumented, or documented by only a few words.

But AI can be a useful tool equivalent to quickly cranking out sketches for those who see pictures in their minds.

Third - you don't think writing is a creative process? Putting your vision down in words (as you do to create a prompt) certainly, in my book, qualifies as creative.

Fourth, identifying which image matches your vision and evaluating it for quality requires an artist's mind. Identifying and editing the portions of the image which don't match your vision requires creativity. This is a different creative process - but photography requires a different creative process than painting, which requires a different one than sculpting. No one creative process is inherently better than another.

Finally - the arguments you are making are extremely reminiscent of the arguments used to dismiss photography as not real art.

highplainsdem

(59,439 posts)
60. I've played with image generators. I know how little control words give the AI user over the image created by
Wed May 7, 2025, 05:41 PM
May 2025

the AI. It isn't unusual to see some specified elements left out, or even if they're included, placed completely wrong.

And even if every element is included, the exact same prompt - without a single word changed - can create an almost endless number of variations, starting with the original four options the user is offered (or whatever the number is with that image generator). Options that can be wildly different. Don't like any of those? It'll give you another four. And again. And again.

Prompts are not truly creative because what you're giving the AI are in effect keywords to assemble something. They're much more comparable to search keywords. But the AI responds to those keywords differently with each image generated from the same prompt.

Are prompts slightly creative? In the sense that any idea summarized in a few words might be considered creative. Are they artistry? No. They're triggers to get the AI to dredge up something from what's in its training data, which consists of other people's artistry plus the labels associated with it from descriptions of it. But the AI user didn't create any of it.

If you paid someone to do a painting and just gave them the prompt you gave an AI, would you think that you were being very creative? Would you think you had created the painting they did?

You have not created the AI images you have the AI spit out possible variations of, just as people using text generators haven't written what it produces, and people using music generators haven't created that music.

What AI tools do is fake creativity.

You could use a Latin name of an animal, a name you don't recognize, as a prompt, with absolutely no idea what the image will look like. And if that Latin name and the image are in the AI's training data, you might even get a correct image. With zero knowledge of what you're supposed to get.

Ms. Toad

(38,062 posts)
66. AI can certainly be used without much creativity.
Wed May 7, 2025, 09:39 PM
May 2025

I never claimed it was always used creatively.

You, on the other hand, are claiming it is NEVER creative. I have described a way in which it can be used creatively as a substitute for the sketching process of ideation.

When you claim an absolute, as you have, a single counterexample - such as the one I have given - disproves it.

As for your specifics - you seem to be deliberately trivializing the process of using AI by focusing solely on the output of a single prompt, and minimizing how it can be used as a tool in the creative process.

As I noted earlier, generating dozens of sketches is a standard part of the ideation process in beginning an art project - but for some of us, creating sketches by hand is not a simple dump of the image in our mind onto the paper, because there is no image in our minds. It is a time-consuming process, which makes it nearly useless for the purposes it is intended to serve at that stage of development. AI is a tool which can be used to quickly generate concrete examples of the ideation. It is the ideation that is the creation - including reducing that idea to a rough visual, whether by sketching it or describing it textually (with or without using AI). Selecting, from among the many images generated, those which (1) are a close match to what was in your mind and (2) are artistically composed - or which prompt ideas for different compositions - is an essential element of the creative process, which draws inspiration from a number of sources.

If the generated images omit parts of the prompt that the artist wanted included, those can be painted in during later iterations, or may be included (or not) in a final art piece which may be rendered entirely in the artist's hand. Or, by viewing the generated options, the artist may discover that the composition is more pleasing without those elements - and again, that requires a human creative assessment, whether those sketches are rendered by the artist's hand or by the artist using a tool (AI - or a camera).

Giving an artist a prompt and paying them to create a painting is more analogous to one of the many ways that generative AI can be used with virtually no creativity. But my point is that AI can be used in ways that ARE creative - whereas your position is that using AI in art is never creative.

There is nothing new in art - and re-envisioning or appropriation (of your own work or that of others) in a different medium or using different tools is an essential part of the foundational skills taught in art school and used by the best artists. For example, dozens of well-respected artists (including Picasso) have appropriated Manet's "Luncheon on the Grass."

Aside from anything I personally believe, I am currently in a NASAD accredited art school, roughly 1/3 of the way to earning a BFA. There are classes in which I am strongly encouraged - and some in which I will be required - to use AI. A session at NASAD's upcoming annual meeting focuses on the uses of AI in art education:

With an emphasis on creativity and self-expression, this session will examine strategies for using Artificial Intelligence (AI) in studio art and design courses. Aware of concerns that the use of AI might diminish student performance, purposeful strategic decisions pertaining to pedagogy can position the artist/designer to embrace AI as a productive tool employed to enhance creative processes. Attendees will explore issues such as course requirements and associated expectations regarding outcomes that may assist them to design and implement approaches that incorporate applications of AI in the studio.


https://nasad.arts-accredit.org/sessions-and-events/

As for the assessments of others in the arts:

National Art Education Association (NAEA) position statement:

AI can be a useful resource for visual arts educators and their students, augmenting their teaching methods and encouraging student experimentation, including, but not limited to, the following:

AI software can assist students in generating initial concepts, ideation, enhancing digital design skills, and experimenting with different artistic elements.
AI can offer students a platform to explore their creativity, fostering innovation and critical thinking skills.
AI can support teachers in lesson and material development, and support differentiating curriculum through providing translations, closed captioning, and other resources.
AI can be a powerful tool to assist all students of varying abilities.


You will note that the first point is essentially what I described as a substitute for sketching. (And I had not previously read this document.)

From the Arts Education Partnership quarterly meeting:

Another comparison was made between AI art and photography. At the time of its invention, photography was a revolutionary and unknown medium that was discounted for its seeming lack of artistry. However, photography is a medium that requires practiced technical skill from the artist to capture beautiful and thought-provoking images. AI will change the art sphere and it will take time for others to learn how to use the technology effectively. However, it is a tool, just like photography and remixing.


Another connection I often mention - the initial disparaging of photography as devoid of any creative merit, to the acceptance that it is indeed art.

I am not making the argument that ALL uses of AI in artistic endeavors are creative or artistic - simply that some are, and that there is a place for its use in art education and beyond.

highplainsdem

(59,439 posts)
67. I don't disparage photography and never have. But I consider genAI unethical, antithetical to creativity,
Wed May 7, 2025, 10:55 PM
May 2025

crippling to those using it, and harmful to the livelihoods of real artists.

It is not a tool like photography. It's a plagiarism machine that those using it have little control over. Just selecting from the infinite number of images it can vomit out is not creating art.

And all the horrendous AI slop polluting the internet is much more representative of what's created by AI than very painstaking attempts to make it into any kind of worthwhile art. Because all the effort in the world does not make the user the creator of the images it spits out mindlessly.

I know entirely too many artists who despise AI to care about academic justifications of it. Those justifications sound a lot like the arguments for other uses of AI like ChatGPT in education.

The paragraph below, at the https://www.arteducators.org/advocacy-policy/articles/1303-naea-position-statement-on-use-of-artificial-intelligence-ai-and-ai-generated-imagery-in-visual-arts-education link you provided, is nonsense.

Since the use of AI has directly challenged the roles of artists and designers in society, a quality art, media arts, and design education program should highlight the technical skill and use of formal qualities for producing conceptually rich content that trained artists and designers provide. AI should be used responsibly and ethically to generate imagery derived from public domain or creative commons licensing, rather than copyrighted works. Educators and students must understand that image generation without proper attribution is a breach of academic integrity akin to plagiarism. The emphasis should be on fostering creativity and innovation while respecting the intellectual property rights of creators and artists. This balanced approach ensures that the use of AI in image generation is both legally compliant and morally respectful of the rights and efforts of original content creators. By approaching AI-generated images with caution and thoughtful guidance, art educators can harness its benefits while preserving the unique aspects of human creativity.


Image generators have NOT been used that way - with respect for artists' rights - and IMO it's absolutely grotesque for whoever wrote that to pretend that they have.

What image generators are you using, that you consider truly ethical? I'm not aware of any that are.

And there's no way to pretend that highly rated and popular image generators like Midjourney and OpenAI's DALL-E 3 aren't based on theft of intellectual property. PC Mag rated Midjourney at the top of the best image generators earlier this year, but admitted "it appears that Midjourney does not care about intellectual property violation." No kidding. None of them do.

IF whoever wrote that ludicrous paragraph at that link TRULY respected the IP rights of artists, they should have identified whichever image generators they consider ethical, and named and shamed every popular image generator built on outright theft. I suspect they didn't because they know very well that almost everyone generating AI art is using an unethical tool, and they don't want to discourage those AI art enthusiasts. So they offer a completely illusory image of ethical AI art that simply doesn't exist, instead of taking an ethical stand against all those unethical but popular tools.

As offensive as the AI bros saying they want to get rid of intellectual property laws are - those poor dears are tired of being sued over their theft of the world's culture - NAEA's pretense that AI image generators are currently ethical and respectful of artists' IP rights is nearly as bad. Because it's a coverup of what's really going on.

Ms. Toad

(38,062 posts)
69. You are mixing arguments.
Wed May 7, 2025, 11:28 PM
May 2025

Whether AI can be a part of the creative process is an entirely separate question from how to use AI ethically.

We agree as to the ethical concerns.

But, as I said, the arguments you are making against using AI as part of the creative process are - almost word for word - the arguments of those disparaging photography as not real art. It is completely irrelevant that you have always considered photography art. You are still, likely without being consciously aware of it, mimicking the arguments levied against photography initially, and against the use of Photoshop as an electronic darkroom later. Since I have lived through those arguments as to photography twice, I know them well, and I know how they turned out in the end. (As well as similar arguments against the use of calculators.)

With the introduction of every new technology, there is the potential for benefits and detriments. Simplistic, knee-jerk rejection of new technology, rather than figuring out how to harness it for good, is not a winning strategy.

highplainsdem

(59,439 posts)
70. Photography would never have been considered art if, instead of capturing an image of what's in front of it,
Thu May 8, 2025, 07:12 AM
May 2025

the camera created an image of something that WASN'T THERE, after the photographer said a few magic words as a prompt.

How much clearer can I make it than by pointing out that an AI image generator will spit out images of things that the AI user knows absolutely nothing about and has never even heard a description of, if those words are in the prompt? The user hasn't magically acquired the knowledge of what that thing looks like plus the ability to create an image of it.

Whether AI can be a part of the creative process is an entirely separate question from how to use AI ethically.


No, it isn't, and it can never be. Because the tools are fundamentally unethical, and they work only because of the enormous number of images in the training data - images that were either stolen outright or misappropriated by companies reinterpreting their TOS to discover - oh, happy day! - that terms artists and photographers had never imagined permitting the training of AI to compete with them actually DO allow it.

I'm still curious about which AI generators your teachers believe it's just fine to use, because even if they, like that paragraph I quoted, pay lip service to the necessity of respecting artists and intellectual property rights, lip service is not enough.

This is an existential fight for real artists.

AI companies are making it clearer that they don't feel they should have to respect IP rights. What's happening in the UK is particularly instructive.

https://www.democraticunderground.com/100220219693

Ms. Toad

(38,062 posts)
74. You are being very clear that your understanding of AI as part of the creative process is as simplistic,
Thu May 8, 2025, 10:55 AM
May 2025

as the general population's (and the art world's) understanding of photography was when it was a newer technology.

In fact, you have essentially stated the primary argument made against cameras as part of the artistic process, without recognizing it as the same essential argument you are making about AI. The argument was that all you are doing with a camera is pointing it at something and clicking a button, and the tool does the rest - just as, supposedly, an artist using AI as part of their creative process merely types a prompt and the tool does the rest.

The argument is not true in either case.

As for whether creativity can be assessed separately from whether the use of a particular tool is ethical: the two questions are independent of each other. Something can be creative and also unethical. Creativity is the process of using your imagination; acting unethically requires the violation of social, moral, or legal norms. You can certainly use your imagination in ways that violate those norms. For example, there are some very imaginative scams that are illegal.

highplainsdem

(59,439 posts)
75. A camera captures an image of something real in front of the camera. It captures and records
Thu May 8, 2025, 07:56 PM
May 2025

reflected light via a light-sensitive medium.

Image generators create images that are mashups from stolen art and photos. Zero talent or skill required from the AI user. As I said, you can get an image generator to spit out a picture of something even if you have absolutely no knowledge of what you're requesting. Not an inkling of what it should look like. But if the right images and words are in its training data, the AI could probably dredge it up, though it might take it a lot of tries. Tries made in seconds, with the AI generating different images every time. It could generate thousands of images fairly quickly. The user could sort through all those images to find one they like, but that isn't creating art - it's shopping for an image to show off - and since the user has absolutely no idea what they're looking for, they might decide an image of something else looks like it could be right.

This. Is. Not. Creating. Art.

You could ask a young child to type in a prompt asking for a picture of a flying pachyderm hovering over an Eames chair in a midcentury modern living room, with the child having no idea what a pachyderm or Eames chair or midcentury modern should look like (and you'd probably have to spell out certain words for them). And if those words and images are in the AI's training data, you might get those elements in a picture. Given the amount and type of intellectual property stolen for the training, the flying pachyderm will probably look like Dumbo, and Dumbo might be on or even under the chair, or the Eames chair might be floating, with or without wings, because that's also the sort of thing these mindless plagiarism machines can do.

But no matter what the results are, the child wouldn't have created the images churned out, and neither would you have. You've shopped for images with keywords.

And no, you cannot separate the fact that the training data was stolen or misappropriated from the images you like to have the AI generate for you. Any more than you could or should separate a meal being prepared by slave labor from a discussion of the cuisine, even if you're particularly happy with the cuisine because you provided your host with the menu and recipes.

Those are real artists and photographers whose work was stolen.

Midjourney has for years been considered one of the best image generators.

It's also been infamous for its images including complete or partial watermarks from all the photos and other images the company stole. And the typical user of these AI tools just views that sort of thing as a nuisance to be edited out.

As I've said here a number of times, if people are forced to use unethical generative AI for work or school, I can sort of understand them trying to set the ethical considerations aside. But that does not make the tools ethical.

The AI bros are hoping to get the laws changed to carve out exemptions for training the AI they hope to make millions, even trillions, from. That won't make the tools more ethical, either.

LearnedHand

(5,214 posts)
59. If you've never tried to generate an artistic image via an AI prompt, you might want to give it a try
Wed May 7, 2025, 05:09 PM
May 2025

Especially before passing judgment on someone's work with the tool. If just any old person tells the AI to draw a cat and a dog, you get exactly the slop you expect. Prompting the AI is THE main artistic achievement, and I have seen some absolute wizardry with AI image generators. It is, after all, just another tool. Or by this reasoning, should we also tell magazines and newspapers to stop using digital layout and typesetting?

highplainsdem

(59,439 posts)
61. I've used image generators. And I know that no matter what the prompt is, it not only doesn't provide
Wed May 7, 2025, 05:55 PM
May 2025

one one-thousandth the control a real artist has, but it can produce a different image - sometimes wildly different - every time it's used, without a single word of the prompt being changed.

A prompt that just happens to create an image you like at one time is likely to create something you dislike at another. There is no real artistic control. The AI user has a rough idea of what they want, and if they like the result they want credit for it and think the prompt is genius, even if that popped up with three other image options that they hated.

It's like shopping online with keywords, and finding something you like. You didn't create it.

senseandsensibility

(24,185 posts)
6. I read a good reply to this article by Jacob T. Levy on bluesky
Wed May 7, 2025, 01:14 PM
May 2025

He is a college professor who wrote: "I got the best teaching evals of my career for a large course last semester--along with three teaching awards--after rebuilding my intro class around in-class handwritten essays. I simply do not believe that this is unsolvable or that students don't care about actually learning to do their work."

This rings true to me. He outlines a simple solution that was standard operating procedure when I was in college in the '80s and can be again. Heck, until a couple of years ago when I retired, I had my second graders write essays in paragraph form on tests in "their best printing." The paragraph format is part of the second-grade curriculum, complete with topic sentence and three to five supporting sentences. This forced them to organize their thoughts and stick to the topic, and really reinforced the concept. So, basically, what I am saying is that if second graders can do it.....

highplainsdem

(59,439 posts)
15. Yes. There've been recommendations of going back to handwritten exams using blue books since
Wed May 7, 2025, 01:43 PM
May 2025

the first months, maybe weeks, after ChatGPT was released.

senseandsensibility

(24,185 posts)
19. I have to wonder if some professors are resistant to this because
Wed May 7, 2025, 01:52 PM
May 2025

of the time it takes to read and evaluate long answers. It is mind-numbingly time consuming, as I recall very well from when I taught middle school English and had almost 200 students each day. I enjoyed it to a certain extent, but WOW...many weekends and evenings lost to that "activity".

Ms. Toad

(38,062 posts)
36. It's not a matter of being allowed to use AI to grade their students -
Wed May 7, 2025, 02:56 PM
May 2025

As a teacher, you can pretty much use whatever tools you want to grade students' work - and I know some who are using AI as a way of doing a first cut. Their contention is that it frees them to focus on more important aspects of the work that AI is not yet good at (picking out made-up crap, appreciating the flow of an argument among essays which each received the same number of points for simply mentioning required elements, etc.)

I, personally, don't think it is sophisticated enough to be a useful tool yet, and I'm not aware of any LLM AI which is exclusively trained on licensed content so there are ethical issues. So I wouldn't use it.

But some instructors are already using it for grading. And - given the increasing workload (increases in class size, decreases in number of faculty) - anything which allows instructors to focus their time on things which AI can't do, or can't do well, isn't inherently bad. I spent 80-100 hours a week during the 20 years I taught (11 in high school, and 9+ in law school). As a salaried employee I was never paid for the 40-60 hours I was donating. I automated as much of my grading work as I could so that I could spend my time on things that were formative, rather than evaluative. Most teachers I know resorted to multiple-choice tests (the fastest - but least accurate - way of evaluating knowledge, with no formative feedback), fewer evaluations, etc., simply to get by without putting in the time required to complete the actual workload.

As a student (which I am currently), I would rather have a teacher who figured out how to spend more of their time creating meaningful lessons and being available to students than grading. And as class sizes increase, the ratio of grading to teaching also increases. Something has to give.

Stuckinthebush

(11,191 posts)
17. The old blue book method
Wed May 7, 2025, 01:46 PM
May 2025

We used a booklet on tests in the early 80s in college. The "Blue Book" had enough pages for a long essay. I wonder if they still make those. Of course, students will have to write using a pen or pencil. Do they still use those?

JustABozoOnThisBus

(24,554 posts)
29. Unless you're Trump, then each of your ghostwriters and test-takers can have 20 pencils.
Wed May 7, 2025, 02:37 PM
May 2025

It's easy to fill a blue-book when you're writing with Magic Markers.

bucolic_frolic

(53,678 posts)
9. Ask ChatGPT what to do about that problem
Wed May 7, 2025, 01:20 PM
May 2025

Students who are very good writers could take the work of AI and edit it for content, readability, colloquialism, believability and make it theirs.

But the vast majority of students are not very good at using the English language in written form.

I don't question so much AI damaging their memory. Problem-solving and creativity are another matter.

Common sense was not taught a lot in my courses. Bias was not worried about. You learned to agree with the professor to survive.

Society is such a pyramid now. Does education matter? Isn't it all about skills, computers, liquidity, and who you know?

On edit, addendum:
+++++++++++++

I just sat through an hour's presentation on using AI for grant applications. Let me be clear: the most gifted writers, best thinkers, and best business executors will have NO advantage in the application process, because AI raises the most mediocre hacks to the same level. We are rewarding less qualified people and penalizing talented ones. This society will collapse over the long haul with this strategy.

highplainsdem

(59,439 posts)
13. It isn't "theirs" and can never be theirs if they're just altering what a chatbot gave them.
Wed May 7, 2025, 01:37 PM
May 2025

And whether or not you question AI damaging their memory, it does, simply because they spent so little time mentally engaged in subjects they're supposed to be studying. They end up ignorant of what they were supposed to have learned, and crippled in terms of problem-solving and creativity.

In effect they've deprived themselves of an education that might've cost them a lot.

And this cheating is depriving society of the educated adults it needs - especially for a democracy.

JBTaurus83

(826 posts)
11. I just took a continuing
Wed May 7, 2025, 01:26 PM
May 2025

education course at a community college, and the instructor actually encouraged students to do this and then just "tweak" the output.

62. None whatsoever, but . . .
Wed May 7, 2025, 06:15 PM
May 2025

Last edited Thu May 8, 2025, 09:09 AM - Edit history (1)

Higher ed instructor here.

No question, an instructor who tells students to "write" essays by tweaking AI-generated output has abdicated a core duty of their role. I tell my students that everything they write must come directly from their own brain, and this is the hill that I will die on. However, I understand why some professors choose to surrender. In a few short years, students have become alarmingly reliant on ChatGPT (or as they call it, "Chat"). No matter how sternly you threaten or how sweetly you cajole, no matter how cleverly you design the assignment, if you let students do work out of class, a good percentage of them will turn in AI-generated garbage. I'm at a relatively prestigious institution, and if I ask students to do out-of-class writing (which I've discontinued almost entirely), at least a third of the work I receive is either unambiguously AI-generated or extremely suspect. And I'm sure some of the AI-generated material that I encounter flies under my radar entirely. If they copy-paste unedited ChatGPT output, it's glaringly obvious, but most of them are smart enough to hide their tracks. Horrifyingly, some of them probably even believe that they actually are doing the work when they fill in the blanks of an AI-generated outline.

At a community college, where many students arrive academically underprepared and juggle education with work, I wouldn't be surprised if a large majority of students used AI. Practically speaking, you can't report the majority of your students each semester for academic dishonesty: you will swiftly find yourself out of a job. So unless you are prepared to eliminate all out-of-class assignments, when you prohibit AI, you're making a rule that you have no way of enforcing, and then you have to grit your teeth all semester long as students flout that rule.

Another factor to consider: to my endless consternation, some institutions either encourage or outright require instructors to permit AI use. My own institution grants instructors a reasonable degree of latitude to set our own course policies, but there's definitely some worrisome pro-AI rhetoric coming from the administration; reading between the lines, they think it's the inevitable future and we have no choice but to get on board. My personal view is that climbing on board the AI train is like getting in the van with the kidnapper: if you don't fight like hell right from the start, you're losing your best shot at getting out alive. But alas, not all of my colleagues--especially the ones who have installed themselves in positions of institutional power--have the appetite for this fight.

FSogol

(47,505 posts)
71. I've heard that one tell for AI is AI likes to use dashes in their writing, like colleagues--especially & power--have
Thu May 8, 2025, 07:13 AM
May 2025

Welcome to DU!

highplainsdem

(59,439 posts)
25. Some kids believe only the degree is important, and not how you got it.
Wed May 7, 2025, 02:01 PM
May 2025

And that's how you end up with supposedly college-educated dunces like Trump, who reportedly paid others to help him through school.

Cheating is far from new. ChatGPT just made it almost universally available, and free.

misanthrope

(9,337 posts)
23. No surprise here
Wed May 7, 2025, 01:56 PM
May 2025

I noticed 40 years ago that most of the college students I met weren't really interested in learning. They sought a degree simply for the access to wealth it provided.

Most Americans don't value intellect or education. They just want money.

hatrack

(64,105 posts)
30. I was a GTA about 25 years ago . . .
Wed May 7, 2025, 02:38 PM
May 2025

Anyway, what struck me at the time was how freaking terrible university students were at writing. Problems with they're/their/there, too/to/two, subject freaking verb agreement, and these were juniors and seniors.

This wasn't something incredibly complex, it was a survey course - Eastern Civilization.

These weren't tests, either - these were term papers, for which they had plenty of time to prepare and research. Roughly 20% were good, solid undergraduate writers, another 20% were getting there, and the remaining 60% marched with the I Suck At Writing Brigade.

highplainsdem

(59,439 posts)
38. By 25 years ago, TV was already taking up more of kids' time than it had been 10-20 years earlier.
Wed May 7, 2025, 03:15 PM
May 2025

Same with video games.

And the fairly sloppy writing those kids often saw online would have contributed to their poor writing.

Once upon a time, almost everything people read would have been written and proofread and edited by professionals. Not any more.

I vaguely remember seeing someone complain online, at least 30 years ago, that the ability to format thoughts in paragraphs seemed to be disappearing - online, at least.

I've heard recently that a lot of kids entering college have never been assigned an entire book to read. Only excerpts, all through school, till college.

cab67

(3,568 posts)
26. some of the slide predates AI
Wed May 7, 2025, 02:18 PM
May 2025

Certain aspects of student effort have been in decline for years. Note-taking is a big one. Many students seem to assume that I'll provide my PowerPoint slides. And other than the pictures I use, I don't.

They also assume that I provide study guides. Which I don't. And then they complain that they don't know what to study for.

If it's important enough that I wrote it down in class, I respond, it's probably a good idea to put it in your notes. That's what you should be studying.

I don't even use a textbook - too expensive and way too out of date for my subjects. That should, you'd think, streamline things - it's all in lecture.

erronis

(22,388 posts)
28. You sound like a real teacher, not just a placeholder standing in front of a class. Bravo.
Wed May 7, 2025, 02:29 PM
May 2025

I like your sentiments on note-taking and no assigned text-books.

Curious. What discipline/courses do you teach?

Random Boomer

(4,376 posts)
40. My natural OCD helped me glide through college
Wed May 7, 2025, 03:16 PM
May 2025

I took notes for every college class in my "rough notes" pad. They were hurried, written in fragments and messy -- and I just couldn't help myself, I HAD to re-write them neatly in another notebook. So usually within a few hours of the lecture, I'd copy over my notes, but turn the fragments into sentences and if I was missing something that was said, I'd look for it in the class textbook or other reference. For science class I'd often add little illustrations of what the professor had drawn or displayed.

I'd review my notes before a test, but I didn't sweat bullets studying, and then I sailed through the class with As or occasional Bs for really tough courses that didn't interest me (I'm looking at you, astronomy).

I had no idea that this was probably THE most effective way to learn, I just couldn't stand messy notes. They offended my aesthetic sensibilities.

badhair77

(5,072 posts)
63. I did that, too.
Wed May 7, 2025, 07:03 PM
May 2025

I loved when the class was set up as a lecture class and I had to take notes. Each evening I would type my handwritten notes and that was an instant review.

SidneyR

(203 posts)
39. We need to go back to blue book exams in person.
Wed May 7, 2025, 03:15 PM
May 2025

No graded work composed outside class. You show up, turn in your phone, and answer questions and write essays on the spot, in the classroom.


Balatro

(51 posts)
49. I am a teacher.
Wed May 7, 2025, 03:58 PM
May 2025

I can tell you firsthand, the arms race never ends. The admins recently ran a PD (professional development) session on using AI, and some of the "old guard" acted like someone was performing a magic trick before their eyes. People do use AI, which is a big reason why many teachers have reverted to low-tech options like Scantrons and handwritten tests. With MCQs and computer-based writing assignments, you almost always have to assume that AI was used.

badhair77

(5,072 posts)
64. I used multiple choice questions and scantrons but I wrote the questions and
Wed May 7, 2025, 07:10 PM
May 2025

they required some thought. I also included an essay question with each test, written in class. It was a tough battle, but I tried to fight the cheating. I even gave different tests within the same class so they couldn't help each other. I'm so glad I'm retired.

LisaM

(29,461 posts)
51. The worst part is, THEY DON'T CARE.
Wed May 7, 2025, 04:14 PM
May 2025

I can't tell you how many people I know see a college degree as something to monetize, or a way to get a job.

I never saw my English degree as something that would get me a higher-paying job, or even as a way to get a job. I went to college to gain knowledge. Period. Yes, there were people pursuing professions (doctor, lawyer, nurse, engineer), but it was generally because they were interested in those fields.

I've had a few different jobs with my English degree, but eventually settled in at law firms, and there is a sea change with younger employees. They are often smart enough, and they are very accepting (I particularly see that they are far less prejudiced against gay people). But their original thinking is almost nonexistent. They want to use tools more than their own thought processes. I do trademark searches. I came up using Boolean logic and other logic-based search commands.

People now don't want to do that! They want to use AI-based searches and OCR for design searches. Not only are the results inferior, but I just don't understand why they are so addicted to shortcuts. Yes, they spit out an acceptable-looking product, but the searchers are almost completely detached from the results and don't use any critical thinking to assess the final report.

Don't get me started on their failing letter-writing skills ("Dear First Name, Last Name" and a complete inability to outline the subject matter), or how they all work with headphones or earbuds instead of wanting to interact with their coworkers.

ybbor

(1,699 posts)
52. Neil deGrasse Tyson said something I'll never forget
Wed May 7, 2025, 04:24 PM
May 2025

Well, I can paraphrase it, anyway. He said that our education system began to fail when we started emphasizing grades over learning. Students stopped being curious about the subject matter and were only concerned with getting the "A." They couldn't care less about the content. Therefore they didn't "learn" it, they got "through" it. It also coincided with teachers' evaluations being based on test scores. Do you recall the teachers changing the answers on their students' standardized tests to improve their performance, and therefore their evaluations?

It’s a shame. I taught for 9 years later in life, wanting to help make kids appreciative of all that is around them, that is based on math and science, but very few cared. Add cell phones into the mix, and it’s crazy. I quit at COVID, for many reasons, pay, lack of respect, and it’s VERY hard! I went back to construction, where people thank me and I make A LOT more money. And I use math and science.

There are a lot of very good kids out there, but there are also some that don’t care at all about increasing their knowledge of what school is offering.

Demovictory9

(37,113 posts)
54. there is the incentive to pile up degrees and credentials to get a jump in the hiring process
Wed May 7, 2025, 04:28 PM
May 2025

Demovictory9

(37,113 posts)
55. "I spend so much time on TikTok," she said. "Hours and hours, until my eyes start hurting, which makes it hard to plan
Wed May 7, 2025, 04:34 PM
May 2025

She already considered herself addicted to TikTok, Instagram, Snapchat, and Reddit, where she writes under the username maybeimnotsmart. “I spend so much time on TikTok,” she said. “Hours and hours, until my eyes start hurting, which makes it hard to plan and do my schoolwork. With ChatGPT, I can write an essay in two hours that normally takes 12.”

madville

(7,834 posts)
56. AI is going to take many of their jobs
Wed May 7, 2025, 04:35 PM
May 2025

Kind of cutting out the middleman (the human) like corporations are going to do in the coming years.

Demovictory9

(37,113 posts)
57. this is kind of hilarious: students in her Ethics and Tech class used AI to respond to "Briefly introduce yourself and
Wed May 7, 2025, 04:36 PM
May 2025

A philosophy professor across the country at the University of Arkansas at Little Rock caught students in her Ethics and Technology class using AI to respond to the prompt “Briefly introduce yourself and say what you’re hoping to get out of this class.”

Demovictory9

(37,113 posts)
58. "Massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentia
Wed May 7, 2025, 04:39 PM
May 2025

“Massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentially illiterate,” he said. “Both in the literal sense and in the sense of being historically illiterate and having no knowledge of their own culture, much less anyone else’s.” That future may arrive sooner than expected when you consider what a short window college really is. Already, roughly half of all undergrads have never experienced college without easy access to generative AI. “We’re talking about an entire generation of learning perhaps significantly undermined here,” said Green, the Santa Clara tech ethicist. “It’s short-circuiting the learning process, and it’s happening fast.”

highplainsdem

(59,439 posts)
65. It's so depressing that it has been happening so fast. I knew it would hurt education, but didn't expect
Wed May 7, 2025, 09:19 PM
May 2025

it to do this much damage in just 2-1/2 years.

pansypoo53219

(22,830 posts)
68. i hated papers, but i almost always did fresh subjects, art school for sure. cannibalism in hs. habit or acquired taste
Wed May 7, 2025, 11:25 PM
May 2025

ecstatic

(34,993 posts)
73. There are ways to adjust to the new environment
Thu May 8, 2025, 09:35 AM
May 2025

There are things that humans can do that AI cannot do. Focus on those things to assess knowledge.

Require VERBAL discussions and participation in class or on Zoom. You can't really AI that.

I think educational institutions can and will adjust. Anyone who wants to get a degree should probably do so now before any changes are implemented. Lol

GreatGazoo

(4,378 posts)
76. US Colleges are archaic so perhaps this will finally prompt them to update
Thu May 8, 2025, 08:06 PM
May 2025

In Europe, exams are oral: they ask, you answer. In some countries you can prepare any way you want and then pay to take proctored exams.

The debt industry was behind the disastrous "college for all" push, which demanded college prep in high schools and then evaluated schools based on the percentage of students they sent to college. Meanwhile, a plumber's apprentice is never in debt or in danger of being replaced by AI, aka chat j'ai pété.

Diraven

(1,776 posts)
77. I think we're rapidly heading to the point
Thu May 8, 2025, 09:09 PM
May 2025

where we won't be able to trust "educated" professionals with anything. Then they'll have to essentially replace them in these jobs with the AI itself, which does all the actual work. The tech-bro billionaires are counting on this. Worst case, this will lead to societal collapse, and the billionaires will decide that all the now-useless people just need to be gotten rid of, permanently.
