Cake day: December 20th, 2024


  • I mean we’re talking about kids who are functionally illiterate. The system has failed to teach them this basic skill. Critical thinking about complex and nuanced topics is way beyond that!

    I agree with you there, and I don’t think we’re really that far apart. Writing has both a synthetic component (the critical thinking I referred to) and a syntactical one (what I believe you’re getting at). Kids have been missing out on the synthetic component for quite a while now, and as a result of AI they’re beginning to miss more of the syntactical part as well.

    Where I disagree with you is:

    And the problem is they’re not going to learn the basic skills if they use AI to prevent themselves from doing any work.

    Kids not doing their work didn’t start with AI. LLMs haven’t even been mainstream or otherwise publicly available for three years yet. A lot of these kids were never going to complete coursework in good faith because the curriculum is failing to engage them. Either that, or there are influences in their lives that make it altogether impossible, such as poverty or neurodivergence. In my other comment I was speaking mainly to career readiness, but the principle of meeting students where their circumstances and interests lie applies throughout their time in K-12.

    A trend I’ve noticed in this issue is demonizing students (which is why I keep bringing it up). These kids didn’t choose to have iPads put in front of them instead of being read to when they were little, or to take classes designed before their parents were born, or any of the other structural features that make education archaic and broken (perhaps by design, but that’s out of scope here). Every stakeholder in this issue should be discussing how school can better serve students; instead, we’ve hastily created a stigma that using AI to complete assignments you don’t understand, don’t have time for, or simply couldn’t care less about makes you a cheater.

    It is truly a wicked problem, and I believe the failure of our leaders to adapt education is primarily to blame. I haven’t even mentioned social media, and I think government’s inability to regulate it shares the blame for kids struggling in school. But as problematic as AI is, it is not the reason this is happening, and we may have to agree to disagree on that point.


    I may disagree with you that the ability to write is, by itself, where the problem lies. In my view, LLMs are further exposing that our education system is doing a very poor job of teaching kids to think critically. It seems to me that this discussion tends to be targeted at A) kids who already don’t want to be at school, and B) kids who are taking classes simply to fulfill a district requirement— and both groups are using LLMs as a way to pass a class that they either don’t care about or don’t have the energy to pass without help.

    What irked me about this headline is labeling them as “cheaters,” and I got push-back for challenging that. I ask again: if public education is not engaging you as a student, what is your incentive not to use AI to write your paper? Why are we requiring kids to learn how to write annotated bibliographies when they already know that they aren’t interested in pursuing research? A lot of the stuff we’re still teaching kids doesn’t make any sense.

    I believe a solution cuts both ways:

    A) Find something that makes them want to think critically. Project-based learning still appears to be one of the best catalysts for making this happen, but we should be targeting it towards real-world industries, and we should be doing it more quickly. As a personal example: I didn’t need to take 4 months of biology in high school to know that I didn’t want to do it for a living. I participated in FIRST Robotics for 4 years, and that program alone gave me a better chance than any in the classroom to think critically, exercise leadership skills, and learn soft and hard skills on my way to my chosen career path. I’ve watched the program turn lights on in kids’ heads as they finally understand what they want to do for a living. It gave them purpose and something worth learning for; isn’t that what this is all about anyway?

    B) LLMs (just like calculators, the internet, and other mainstream technologies that have emerged in recent memory) are not going anywhere. I hate all the corporate bullshit surrounding AI just as much as the next user on here, but LLMs still add significant value in select professions. We should be teaching all kids how to use LLMs as an extension of their brains rather than as a replacement for them, instead of universally demonizing the technology.


  • sites feeding content that use an algorithm to personalize feeds for each user are all compromised.

    Not arguing against this at all because you’re completely correct, but this feels like a key example of governments being too slow (and perhaps too out of touch?) to properly regulate tech. People clearly like having an algorithm, but algorithms in their current form give tech companies a convenient excuse to throw their hands up and claim no foul play, because of how opaque the systems are. “It only shows you what you tell it you want to see!” is easy for them to say, but until consumers are given the right to know exactly how each one works, almost like nutrition facts on food packaging, we’ll never know whether they’re telling the truth. A tech company having near-unlimited control, with no oversight, over what millions of people look at day after day is clearly a major factor in what got us here in the first place.

    Not that there’s any hope for new consumer protections under this US administration, but it’s something I’ve been thinking about for a while.