I have worked in technology nearly all my life, and while it has generally been advantageous, I'm starting to see pitfalls, mainly as we go down a more automated/generative AI route.
I could start at the point where people are looking for employment: CVs are now auto-scanned rather than read by a person, and the CV itself may well have been written for the candidate. Human interaction has been removed. Often the best hires don't have the best CVs, but those people aren't even getting into the process any more, so what are we losing?
Leaving that aside and turning to Gen (Generative) AI (Artificial Intelligence): this is the process of using LLMs (Large Language Models) to effectively decide an outcome... much like we humans do. We take what we know, have heard or believe, plus a set of contributory factors, and reach an ultimate decision. Sounds good, right...
BUT... let me paint this scenario. The LLM's data sources (to use layman's terms, or as plain English as I can manage, so it won't be perfect; geeks take note), such as the internet, can be flawed. How so? Well, I'm sure everyone has heard of fake news. If fake news is included in the source or reference data, it effectively gains validation: the more often it is used or quoted, the more validated it appears, until it becomes the primary response and is treated as the most obvious answer, simply because it has been quoted and referenced most often... see the feedback loop forming?
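To make that loop concrete, here's a toy sketch (not how any real LLM actually works, just an illustration of the principle): if a system naively treats "most often repeated" as "most likely true", a much-quoted false claim will outrank a less-quoted true one. The corpus and claim names below are entirely made up for the example.

```python
# Toy illustration: ranking "facts" purely by how often they are repeated.
# NOT a model of any real AI system; it just shows how a popularity
# signal lets a widely repeated false claim win over a true one.

from collections import Counter

# Hypothetical corpus: each entry represents one source quoting a claim.
corpus = [
    "true_claim", "false_claim", "false_claim", "false_claim",
    "false_claim", "true_claim", "false_claim",
]

def top_answer(sources):
    """Return the claim quoted most often - repetition treated as 'validation'."""
    counts = Counter(sources)
    return counts.most_common(1)[0][0]

print(top_answer(corpus))  # the false claim wins on repetition alone
```

Each time the top answer is itself quoted elsewhere, its count grows, which is exactly the self-reinforcing loop described above.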
Now people will talk about controls and safeguards, but I think we have to be very careful how much control we start to hand over to machines, logic, artificial intelligence, etc....
I'm not doom-mongering, but I am an advocate of ensuring we retain or develop proper control mechanisms, and that the entire population understands the implications of the road we are travelling... and that doesn't even scratch the surface.
Never thought I'd be spouting this on a woodworking forum!
I'd almost prefer starting a thread on sharpening