Oh yeah, it’s definitely useful for that!
Since LLMs are essentially just very complicated probabilistic links between words, they seem to be extremely good at picking the exact word or phrase that even a thesaurus couldn’t get me.
I primarily end up using LLMs through DuckDuckGo’s private frontend alongside a search. If my current search doesn’t yield the correct answer to my question (i.e. I ask for something, but those keywords only ever turn up results on a different, similar topic), then I go to the LLM and ask a more refined question that otherwise wouldn’t produce any relevant results in a traditional keyword search.
I also use integrated LLMs to format and distill my offhand notes (and to repeatedly reformat arbitrary text against specific criteria for structured notes), to learn programming syntax more at my own pace and in my own way, and just generally to get answers on well-known topics a lot faster than I would scrolling past 5 pages of SEO-“optimized” garbage designed to fill time for the ads to load before actually giving me a good answer.
I have never once found an “AI” feature integrated by a corporation useful.
I have only ever found “AI” useful when it’s unobtrusive and something I chose to use manually. Sometimes an LLM is useful, but I don’t need it shilled to me inside a search bar, or in a support chat that won’t solve my problem until I bypass the LLM.
I feel like this says more about these students’ schools than about the students themselves.
You will eat here, and you will be happy about it! 😡
I find those kinds of chatbots useful, but they aren’t the ones I encounter 90% of the time. Most of the time, it’s a chatbot that summarizes the help articles I just read, gives faulty interpretations of the source material, and then never directs me to a real person unless I tell it multiple times that the articles it’s paraphrasing aren’t helping. (And sometimes there’s no live support at all, only an LLM + support articles.)