adi_kurian a day ago

Good article, though rather than philosophizing about lost souls, we probably should go back to the future.

Pen-and-paper exams only. No take-home essays or assignments. Assessments done in person under supervision. No devices in class. Heavily reduced remote learning or online coursework. Coursework redesigned so that any out-of-class work is explicitly AI-collaborative. Frequent low-stakes in-class writing to verify student voice and baseline ability. And, when resources permit, oral exams and presentations as a means of assessment.

We did this for decades when tuition was a fraction of today's cost. Any argument that we can't return to basics is bollocks.

If you're trying to hawk education for $$$$$$, probably need to offer some actual human instruction, not Zoom and Discord sessions that anyone could run from their bedroom.

If they can't, then the rot and capture really is as bad as this makes out, and to update Will Hunting: the kids might as well save $150k and get their learning for $20/month on ChatGPT.

  • vondur 20 hours ago

    This is the correct answer. I work at a CSU (non-faculty), and the issue here is that many of the faculty like using online and automated systems to dish out the work and the grading. Going back to doing it the old-fashioned way will provoke pushback from the faculty, who will complain about workloads etc...

  • Pet_Ant 21 hours ago

    My parents had oral exams in university. I feel like that is actually a better format: it does not reward cramming, it is interactive, and it is over quicker. It creates a dynamic that allows for grading problem solving over regurgitation.

    • ravenstine 21 hours ago

      I agree, though I shudder to imagine how cringey the switchover would be. A significant number of students already had poor diction and linguistic skills when I was in college, and recent evidence shows this situation has likely become worse.

    • RAIN92 15 hours ago

      It's ironic. In Italy we have always had constant oral exams (and still do!) from elementary school all the way to Uni. At least 2 per week in high school.

      In an effort to standardize European systems many courses are trying to get rid of them because foreign students are particularly weak in an oral defense.

      Turns out we were right for once :D

    • Der_Einzige 20 hours ago

      Anyone with oral exams was privileged.

      • defrost 19 hours ago

        Privileged enough to have a place at a university, sure.

        That didn't universally equate to privilege in a class or wealth sense for a number of countries.

        eg: https://www.whitlam.org/whitlam-legacy-education

        was the system I was educated under; when I took orals it was a result of being a scruffy kid who wore no shoes but passed general high school and math talent exams better than all but three others my age in the state.

        ( For interest, the three that ranked higher than myself that year in Tertiary admissions exams were all educated in expensive private schools in the capital city; I got by on School of the Air, a bunch of books and a few years at a smallish remote high school in far north W.Australia

        * https://www.aades.edu.au/members/wa

        1970's ham radio running off truck batteries - pre internet for that area, although we did experiment with text over phone line and packet radio.

        * https://en.wikipedia.org/wiki/Prestel

        )

      • Pet_Ant 20 hours ago

        > Anyone with oral exams was privileged.

        No, they really weren't. These were state schools in 1970s eastern Europe. No tuition, and neither parent was from a privileged background.

  • binary132 20 hours ago

    it's almost as though what we need is to entirely ditch the university scam and create something new that more closely resembles what its real purpose was always supposed to be.

  • euroderf 20 hours ago

    In 7th grade history, once a week the class started with a brief (written) quiz.

    Is something like that so hard to do?

chistev a day ago

Last month, I was listening to the Joe Rogan Experience episode with guest Avi Loeb, who is a theoretical physicist and professor at Harvard University. He complained about the disturbing rate at which his students are submitting academic papers with references to non-existent scientific literature, so clearly hallucinated by Large Language Models (LLMs). They never even bothered to confirm their references and took the AI's output as gospel.

https://www.rxjourney.net/how-artificial-intelligence-ai-is-...

nzach a day ago

> As philosopher Peter Hershock observes, we don’t merely use technologies; we participate in them. With tools, we retain agency—we can choose when and how to use them. With technologies, the choice is subtler: they remake the conditions of choice itself. A pen extends communication without redefining it; social media transformed what we mean by privacy, friendship, even truth.

That doesn't feel right. I thought that several groups were against the popularization of writing through the times. Wasn't Socrates against writing because it would degrade your memory? Wasn't the church against the printing press because it allowed people to read in silence?

Sorry for the off-topic.

  • giraffe_lady a day ago

    I'm not that well read on Hershock but I don't think this is a very good application of his tool-vs-tech framework. His view is that tools are localized and specific to a purpose, where technologies are social & institutional. So writing down a shopping list for yourself, the pen is a tool; using it to write a letter to a friend, the pen is one part of the letter-writing technology along with the infrastructure to deliver the letter, the cultural expectation that this is a thing you can even do, widespread literacy, etc.

    Again I think this is a pretty narrow theory that Hershock gets some good mileage out of for what he's looking at but isn't a great fit for understanding this issue. The extremely naive "tools are technologies we have already accepted the changes from" has about as much explanatory power here. But also again I'm not a philosopher or a big Hershock proponent so maybe I've misread him.

    • calvinmorrison a day ago

      It's essentially Dr Ted's theory of small scale vs large scale technology

  • AndrewKemendo a day ago

    That is perfectly on topic, and you are correctly identifying a flaw in the argument.

    Technology is neutral; it has always been neutral and it will always be neutral. I quote Bertrand Russell on this almost every day:

    “As long as war exists all new technology will be utilized for war”

    You can abstract this away from “war” into anything that’s undesirable in society.

    What people are dealing with now is the newest transformational technology, and they can watch how using it inside the current structural and economic regime of the world accelerates the destructive tendencies already embedded in the system we built.

    I’m simply waiting for people to finally realize that, instead of blaming it on “AI”, just as they’ve always blamed social media, TV, radio, electricity etc…

    It’s literally the oldest trope with respect to technology and humanity: some people will always blame the technology when in fact it isn’t the technology…it’s the society that’s the problem.

    Society needs to look inward at how it victimizes itself through structural corrosion, not look for some outside person who is victimizing them

    • MattGrommes 21 hours ago

      > Technology is neutral it’s always been neutral it will be neutral

      I agree with a lot of what you say here but not this. People choose what to make easy and what to make more difficult with technology all the time. This does not make something neutral. Obviously something as simple as a hammer is more neutral but this doesn't extend to software systems.

      • AndrewKemendo 21 hours ago

        > People choose what to make easy and what to make more difficult with technology all the time.

        Right. People choose.

        More specifically people with power direct what technologies get funded. How society chooses who is in power is the primary problem.

  • Der_Einzige 20 hours ago

    BTW that line you are quoting is probably itself AI generated :^)

charlie-83 a day ago

The situation in higher education at the moment does seem pretty dire. However, I do have some hope that a new system could emerge from this which would be better.

The purpose of higher education should be to learn things that will be useful to you (most likely in a career). However, the current purpose is to gain a piece of paper which will mean your job application doesn't get immediately thrown out.

People being willing to spend so much time and money on university only to deliberately avoid learning or thinking by using AI to cheat on everything suggests that the system itself is broken.

These students don't actually want to be in university but feel they have to in order to have a chance at success in the current job market. We are in a prisoner's dilemma where everyone is getting degrees just to be a more appealing applicant than the next person. You might have authored a very impressive open-source library but still not get the junior software dev job, because HR never gave your CV to the hiring manager since you don't have a STEM degree and 50 other applicants did.

However, I don't really know how universities will evolve from this or what this new system will be. It seems hard to motivate a bunch of 18 year olds to actually want to learn stuff without dangling a piece of paper and exams at the end. Maybe that's just a symptom of all the levels of education that come before university also dangling paper and exams. There were certainly parts of my degree I would have, at the time, liked to have skipped with AI, but now (older and wiser) I'm very glad I couldn't.

  • flr03 a day ago

    This is simplistic and, I believe, wrong. People still go to university because they are passionate and want to learn things, exchange ideas with peers, and grow as a person.

    Education is not just "buying" a certification to open doors. That part I'm happy to get rid of.

    • charlie-83 a day ago

      I completely agree with you. While I got a piece of paper at the end, I also learned lots of really useful things and met a lot of interesting people. There are still lots of passionate students that want to learn as much as they can.

      But those students aren't going to be using AI to skip all the learning. The article and just about everyone in higher education right now are saying that a large number of students are doing that. So there must be a large number of students who are primarily motivated by the piece of paper (and the job opportunities it provides).

      That doesn't mean that they must be completely disinterested in their subject. They might have some lectures they really like and where they do the coursework properly. However, the epidemic of AI cheating speaks to the inefficiency created by the need for the piece of paper. If someone is essentially skipping 80% of the learning with AI then the job market requiring you to have a piece of paper is causing someone to waste 80% of their time and money. They would be better served by a short course teaching them only that 20% of skills they actually want.

      The social side of things isn't something I was really addressing in this context. To me, that's a bonus of university. Given the cost, it doesn't seem worth going to university primarily for a social experience (unless you live somewhere where it's free). I also really hope that AI isn't affecting these social aspects.

waffletower a day ago

This is such a naive, simplistic, distrusting and ultimately monastic perspective. An assumption here is that university students are uncritical and incapable of learning while utilizing AI as an instrument of mind. I think a much more perceptive assessment would be that the presence of AI demands a transformation and evolution of university curricula and assessment -- and the author details early attempts at this -- but declares them failures and uncritical acquiescence. AI is literally built from staggeringly large subsets of human knowledge -- university cultures that refuse to critically participate and evolve with this development, and react by attempting to deny student access, do not deserve the title "university" -- perhaps "college", or the more fitting "monastery", would suffice. The obsession with "cheating", the fallacy that every individual needs to be assessed hermetically, has denied the reality (for centuries) that we are a collective and, now more than ever, embody a rich mass mind. Successful students will grow and flourish with these developments, and institutions of higher learning ought to as well.

  • ragingregard a day ago

    > This is such a naive, simplistic, distrusting and ultimately monastic perspective

    This is such a disingenuous take on the article, there's nothing naive or simplistic about it, it's literally full of critical thought linking to more critical thought of other academic observers to what's happening at the educational level. The context in your reply implies you read at most the first 10% of the article.

    The article flagged numerous issues with LLM application in the educational setting including

    1) Critical thinking skills, brain connectivity and memory recall are falling as usage rises; students are turning into operators and are not getting the cognitive development they would through self-learning.

    2) Employment pressures have turned universities into credentialing institutions vs learning institutions; LLMs have accelerated these pressures significantly.

    3) Cognitive development is being sacrificed, with long-term implications for students.

    4) School admins are pushing LLM programs without consultation, as experiments instead of in partnership with faculty. Private-industry-style disruption.

    The article does not oppose LLM as learning assistant, it does oppose it as the central tool to cognitive development, which is the opposite of what it accomplishes. The author argues universities should be primarily for cognitive development.

    > Successful students will grow and flourish with these developments, and institutions of higher learning ought to as well.

    Might as well work at OpenAI marketing with bold statements like that.

    • waffletower a day ago

      The core premise is decidedly naive and simplistic -- AI is used to cheat and students can't be trusted with it. This thesis is carried through the entirety of the article.

      • ragingregard a day ago

        That's not the core premise of this article, go read the article to the end and don't use your LLM to summarize it.

        The core premise is cognitive development of students is being impaired with long term implications for society without any care or thought by university admins and corporate operators.

        It's disturbing when people comment on things they don't bother reading, literally aligning with the point the article is arguing, that critical thinking is decaying.

      • allturtles a day ago

        So you believe students don't use AI to cheat, and you are calling the OP naive?

        • waffletower 21 hours ago

          That's an utterly hilarious straw man, a spin worthy of politics and, as someone else might label it, a tautological "cheat". Students "cheated" hundreds of years ago. Students "cheated" 25 years ago. They "cheat" now. You can make an argument that AI mechanizes "cheating" to such an extent that the impact is now catastrophic. I argue that the concern for "cheating", regardless of its scale, is far overblown and a fallacy to begin with. Graduation, or measurement of student ability, is a game, a simulation that does not implicitly test or foster cognitive development. Should universities become hermetic fortresses to buttress against these untold losses posed by AI? I think this is a deeply misguided approach. While I was a professor myself for 8 years, and do somewhat value the ideal of The Liberal Arts Education, I think students are ultimately responsible for their own cognitive development. University students are primarily adults, not children and not prisoners. Credential provisions, and graduation (in the literal sense) of student populations, are institutional practices to discard and evolve away from.

        • flag_fagger 21 hours ago

          ChatGPT told them otherwise.

          Seriously, you’re arguing with people who have severe mental illness. One loon downthread genuinely thinks this will transform these students into “geniuses”

      • waffletower a day ago

        You can straw man all you like. I haven't used an LLM in a few days, and definitely not to summarize this article. What you claim is the central idea is directly related to my claim; it's very easy to combine them: students' intellectual development is going to be impaired by AI because they can't be trusted to use it critically. I disagree.

        • gizmo a day ago

          When AI tools make it easy to cruise through coursework without learning anything, many students will just choose to do that. Intellectual development requires strenuous work, and if universities no longer make students strain, then most won't. I don't understand why you think otherwise.

        • ragingregard 21 hours ago

          > You can straw man all you like

          No one is misrepresenting your argument, it's well understood and being argued that it is false.

          > students intellectual development is going to be impaired by AI because they can't be trusted to use it critically.

          This debate is going nowhere so I'll end here. Your core premise is on trust and student autonomy, which is nonsense and not what the article tackles.

          It argues LLM literally don't facilitate cognitive brain development and can actually impair it, irrelevant to how they are used so it's malpractice for university admins to adopt it as a learning tool in a setting where the primary goal should be cognitive development.

          Students are free to do as they please; it's their brain, money and life. Though I've never heard anyone argue they were at their wisest in their teens and twenties as a student, so the argument that students should be left unguided is also nonsense.

          • waffletower 21 hours ago

            You said I didn't read the article. That is your weak and petty straw man. Very clearly.

        • awillowingmind 20 hours ago

          I’m not sure how you lived through the last decade and came to the conclusion that people aged 17-25 make rational decisions with novel technologies that have short term gain and long term (essentially hidden) negative side effects.

          • waffletower 19 hours ago

            It seems that 10% of college students in the U.S. are younger than 18, or do not have adult status. The other 90% are adults who are trusted with voting and armed services participation, and who enjoy most other rights that adults have (with several obvious and notable exceptions: car rental and legal controlled substance purchase, etc.). Are you saying that these adults shouldn't be trusted to use AI? In the United States, and much of the world, we have drawn the line at 18. Are you advocating that AI use shouldn't be allowed until a later cutoff in adulthood? The "essentially hidden" negative side effects that you allude to are not at all definitively established, and may not actually exist.

  • turzmo 14 hours ago

    No you are wrong. Students use AI not to augment their minds, but to replace the use of them.

  • add-sub-mul-div a day ago

    Even conceding that you, the person reading this comment, will only use AI the right way, with diligence and curiosity: it takes a significant amount of denial not to understand that the majority of people see AI as a shortcut to do their job with the least possible amount of effort, or as a way to cheat. These are the people you will be interacting with for the coming decades of your life.

    • waffletower 19 hours ago

      If a student is given a task that a machine can do, and there is some intrinsic value for the student to perform this task manually and hermetically, this value ought to be explained to the student, and they can decide for themselves how to confront the challenge. I think LLMs pose an excellent challenge to educators -- if they are lazily asking for regurgitation from students they are likely to receive machine-aided regurgitation in 2025.

MichaelRazum a day ago

There seem to be two likely outcomes. First, the value of education drops, since studying becomes much easier. Second, we will have a few young genius-level people who were able to learn very quickly with the help of AI.

jeremysalwen a day ago

Did anyone else think there were several transitions that seemed like pure GPTisms?

> This isn’t innovation—it’s institutional auto-cannibalism. The new mission statement? Optimization.

reify a day ago

It seems that the UK government is all in with AI in the classroom.

here is a UK .gov study of 21 schools, colleges, academies, universities and technical colleges that have adopted AI.

https://www.gov.uk/government/publications/ai-in-schools-and...

> The majority of education providers have yet to adopt AI. Further research with those yet to adopt AI, and/or who are not considering using it, would help us understand better the barriers to more widespread use of AI across different types and phases of education.

> For example, the assistant headteacher of one school said the top categories their teachers used in their AI tool were ‘help me write’, ‘slideshow’, ‘model a text’, ‘adapt a text’, ‘lesson plan’ and ‘resource generation’.

  • blibble a day ago

    I am thankful I was awarded my degrees before this crap ever existed

    > As the AI champion told us: “If you put junk in, you’ll get junk out.”

    and if you put gold in, you'll still get junk out

nacozarina 15 hours ago

for every new technology mankind has developed, the safety rules are written in blood, based on the series of calamities and inhumanities that followed in the invention's wake

it’s not gonna be any different this time

enceladus06 a day ago

Just use closed-notes pen and paper exams and allow AI use entirely for everything else.

Also women's and gender studies degrees were already a scam unless you have a trust fund.

  • defgeneric 21 hours ago

    I suspect the move back to pen-and-paper exams is being resisted by the teachers. It shouldn't be that hard though: when the workload became too great, most of my own professors would offload part of the grading task to TAs and grad students.

    It does seem like in-person pen-and-paper exams would hold the line pretty firmly with respect to competence. It's a simple solution and I haven't heard any good arguments against it.

ActorNightly a day ago

My experience is with US universities only, but I'm glad that they are becoming irrelevant. They are a scam through and through.