'I've Seen the Future of Consumer AI, and it Doesn't Have One' (theregister.co.uk)
- News link: https://tech.slashdot.org/story/18/09/06/205221/ive-seen-the-future-of-consumer-ai-and-it-doesnt-have-one
- Source link: https://www.theregister.co.uk/2018/09/05/consumer_ai_ifa_2018_roundup/
Re: (Score:2)
> Example: Apple will go under...any day now....since 1984
But they've been totally correct in predicting that the "Year of the Linux Desktop" would never come.
You win a few and lose a few.
Re: (Score:2)
*nod* To expand on this... True, Apple never did go under. But look how many computer companies started up around the same time and did. It is fun to look at the successes and compare them to the naysayers who were wrong, but as for the ones who were right, well, their predictions did not leave much to talk about today.
Re:Now With AI! (Score:5, Informative)
> Gee, I could have sworn we already HAD the AI craze back in the late 80s. Or was it early 90s?
It was the 1980s. It had faded long before 1990.
But there was an earlier AI craze in the 1960s, based on perceptrons (toy sketch below). That faded by 1970.
The 1980s AI hype cycle was driven by "expert systems" and "Lisp machines".
The latest cycle started in 2006 with the publication of the [1]seminal paper on deep learning [sciencemag.org], and has so far lasted far longer than any previous AI hype cycle.
[1] http://science.sciencemag.org/content/313/5786/504
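For the curious, the perceptron behind that first craze is almost embarrassingly simple by modern standards. Here's a minimal toy sketch in plain Python (my own illustration, not from any paper; it learns the AND function):

    # Rosenblatt-style perceptron learning AND. Toy illustration only.
    def train_perceptron(samples, epochs=20, lr=0.1):
        w, b = [0.0, 0.0], 0.0
        for _ in range(epochs):
            for (x1, x2), target in samples:
                # Step activation: fire if the weighted sum exceeds zero.
                out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
                err = target - out
                # Classic update rule: nudge the weights toward the target.
                w[0] += lr * err * x1
                w[1] += lr * err * x2
                b += lr * err
        return w, b

    AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = train_perceptron(AND)
    for (x1, x2), _ in AND:
        print(x1, x2, "->", 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)

Minsky and Papert famously showed in 1969 that a single perceptron can't even learn XOR, which is a big part of why that craze died.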
Re: (Score:2)
I go way back, too.
AI had an unambiguous definition that eroded under stress because the industry came to the realization that the "I" part (intelligence) used the human mind as the high bar.
The second epiphany came when no one could fabricate an AI that would simply refuse to cooperate if Facebook was unreachable.
Re: (Score:2)
In the 90s it was all "knowledge-based systems" and in the noughties it was all "intelligent agents".
Re: (Score:2)
> In the 90s it was all "knowledge-based systems" and in the noughties it was all "intelligent agents".
Yes, but those generated far less hype than what happened in the 60s, 80s, and teenies.
The big things in the 90s and noughties were the web and e-commerce.
Re: (Score:2)
Thanks for the Lisp reference! I fondly remember learning Lisp in an AI class during college in the 80s. Actually enjoyed programming Lisp because it could be so terse and do so much very rapidly. However, we really had no good applications to use for it, other than having an application learn the best way to win a chess game. I chose not to pursue AI as a career and haven't suffered for that.
Re: (Score:2)
> I chose not to pursue AI as a career and haven't suffered for that.
Learning Lisp would not have helped you. Modern AI mostly uses Python-based libraries such as TensorFlow and PyTorch. C++ is used for performance-critical stuff. Nobody uses Lisp for AI anymore. It was a dead end.
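For anyone wondering what the Python side looks like, here's a minimal PyTorch sketch (a toy regression of my own invention, assuming torch is installed; real models differ mainly in scale):

    import torch
    import torch.nn as nn

    # Toy data: learn y = 2x + 1 from noisy samples.
    x = torch.linspace(-1, 1, 100).unsqueeze(1)
    y = 2 * x + 1 + 0.1 * torch.randn_like(x)

    # A tiny network: linear -> ReLU -> linear.
    model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.MSELoss()

    for _ in range(200):
        opt.zero_grad()              # clear old gradients
        loss = loss_fn(model(x), y)  # how wrong are we?
        loss.backward()              # backpropagate
        opt.step()                   # update the weights

    print(model(torch.tensor([[0.5]])).item())  # should be close to 2.0

Ironically, the terseness you liked about Lisp survives: the whole training loop is four lines.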
Fifth Generation (Score:2)
Fueling the hype in the 1980s AI cycle was the Japanese Fifth Generation project, whose stated goal was to leapfrog the West's computer technology and skills. People like Edward Feigenbaum and Pamela McCorduck used the FUD generated around this project to call for increased funding, claiming in their 1983 book 'The Fifth Generation: Japan's Computer Challenge to the World' that "America needs a national plan of action, a kind of space shuttle program for the knowledge systems of the future."
Re: (Score:2)
> Cats on the blockchain, anyone?
Well, at the very least, every zig should be on the blockchain. Don't know about Cats.
You can stop reading at "Orlowski" (Score:4, Interesting)
Andrew Orlowski of The Register is basically a professional dickhead. His main goal seems to be to act as obnoxious and ignorant as possible, presumably to troll the readership. He's pretty much the reason I stopped reading The Register: the constant stream of utter bullshit from that guy.
Re:You can stop reading at "Orlowski" (Score:4, Interesting)
And he also appears to be a climate change denier...
(at least in some of his Register articles.)
Re: (Score:2)
So, Walt Mossberg for a new generation? Shut up!
AI in a Toaster! (Score:2)
Red Dwarf has already shown why this is a BAD Idea.
https://www.youtube.com/watch?v=lhnN4eUiei4
Re: (Score:2)
Please learn basic HTML. K, thanks.
Re: (Score:2)
How about slashdot stop being entirely backwards with that shit instead?
Re: (Score:2)
But I do like being able to verbally ask my phone to navigate to a contact, without having to squint at a screen in the sun, and get turn by turn directions. Digital assistants have slipped into a place in my life where they do a few useful things. As time goes on, this set will grow larger.
But I know: "If it works, it's not AI!" "If it's AI, it won't work!"
Aibo (Score:1)
If Sony's Aibo lives up to the demos I have seen - that would be one big application. AI as a pet.
I also use AI (maybe more ML) all the time with photo sorting, image recognition, etc. It is already in the home.
OP must be joking... (Score:4, Insightful)
... because consumer AI is *ALREADY* ubiquitous and all around us.
From the face detection in your phone, to the fuzzy logic controllers in washing machines, to the ant colony algorithms being used to route network traffic, to finding directions with Google Maps, to Netflix and Amazon's recommendation algorithms, to OCR for cheques and mail, to Nest thermostats, to robot vacuum cleaners and lawn mowers, to expert systems in medical diagnosis... (I could keep going; there's a toy example of one of these below)
AI in consumer products is literally *already* ALL around us.
Saying that consumer AI "has no future" is like looking around at the world today and saying "personal cars have no future" - it's completely idiotic because to anyone with half an ounce of perception that future is ALREADY here.
It's like looking at a forest and claiming there are no trees.
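To make one of those concrete, here is roughly the shape of a bare-bones recommendation algorithm, as a toy Python sketch (made-up ratings and plain cosine similarity; Netflix and Amazon obviously do something far more sophisticated):

    from math import sqrt

    # Made-up user -> {item: rating} data, for illustration only.
    ratings = {
        "alice": {"dune": 5, "alien": 4, "amelie": 1},
        "bob":   {"dune": 4, "alien": 5, "brazil": 4},
        "carol": {"amelie": 5, "brazil": 2, "alien": 1},
    }

    def cosine(u, v):
        # Similarity of two users, dotted over the items both rated.
        common = set(u) & set(v)
        if not common:
            return 0.0
        dot = sum(u[i] * v[i] for i in common)
        return dot / (sqrt(sum(x * x for x in u.values())) *
                      sqrt(sum(x * x for x in v.values())))

    def recommend(user):
        # Score items the user hasn't seen by what similar users liked.
        scores = {}
        for other, theirs in ratings.items():
            if other == user:
                continue
            sim = cosine(ratings[user], theirs)
            for item, r in theirs.items():
                if item not in ratings[user]:
                    scores[item] = scores.get(item, 0.0) + sim * r
        return sorted(scores, key=scores.get, reverse=True)

    print(recommend("alice"))  # ranks "brazil" from similar users' ratings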
Re: (Score:2)
Yeah it seems like it is a natural fit in optimizing the things we do.
Even though I don't routinely use my phone as an alarm clock, it still knows when I'm likely to get up: if I plug it in at bedtime, it does a good job of figuring out when I'm likely to get up and adjusts its charging rate to be done about an hour before then. Yet if I plug it in at 3pm, it assumes I want as much charge as possible and charges as fast as it can. It's not rocket science, but it's useful (rough sketch of the idea below).
Do I need a dishwasher with a screen that I can talk to?
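That charging behaviour is easy to sketch. Here's a toy Python version (made-up numbers and a median-of-history guess; I have no idea how any actual phone firmware does it):

    from statistics import median

    # Hours past midnight when the owner recently woke up (made up).
    recent_wakeups = [6.5, 7.0, 6.75, 7.25, 6.5]

    def charge_rate(plug_in_hour, battery_pct):
        """Return a charging rate in percent per hour."""
        predicted_wake = median(recent_wakeups)
        finish_by = (predicted_wake - 1) % 24  # done an hour before waking
        hours_left = (finish_by - plug_in_hour) % 24
        needed = 100 - battery_pct
        if hours_left > 12 or hours_left < 1:
            return needed  # daytime top-up (or cutting it close): full speed
        return needed / hours_left  # otherwise pace it out overnight

    print(charge_rate(23.0, 40))  # 11pm at 40%: slow overnight charge
    print(charge_rate(15.0, 40))  # 3pm at 40%: full-speed top-up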
Re: (Score:2)
> Do I need a dishwasher with a screen that I can talk to?
Nope, but I'm willing to bet it has an embedded fuzzy logic controller in it to control water levels.
Re: (Score:2)
"Do I need a dishwasher with a screen that I can talk to?" Printers have a screen. You can't talk to it (at least you're not supposed to--when aggravated, I've been know to do so, and not kindly). But try to decipher what's on that screen. I claim that printers are not any easier to use than they were in 1984 (which is when I got my first dot matrix printer). You (ok, I) *still* can't figure out what's wrong with them, despite the screen.
Re: (Score:2)
> None of the things you mention actually contain any real artificial intelligence, in the sense of being able to make decisions in the face of unknown circumstances and data sources.
They do actually.
Roombas have to be able to adapt to unknown obstacles and uncertain sensory input (could get blocked, partially occluded etc...).
Embedded fuzzy logic controllers (also used in anti-lock brakes) have to be able to maintain a steady output signal given uncertain input (wear and tear on the mechanics, grit...) that can vary wildly in an unknown manner.
OCR systems need to be able to tell the difference between a cheque and unknown things, like night club flyers, and they deal with handwritten input as well. (A toy sketch of one of those fuzzy controllers follows below.)
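Since fuzzy controllers keep coming up: the core idea fits in a screenful. Here's a toy Python sketch (membership functions and rules invented by me for illustration, not any real appliance's firmware):

    # Toy fuzzy controller: map a water-level error to a valve adjustment.
    def tri(x, a, b, c):
        """Triangular membership: 0 outside [a, c], peaking at 1 at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def control(error):  # error = desired_level - measured_level, in cm
        # Fuzzify: how "low", "ok" and "high" is the level right now?
        low = tri(error, 0, 5, 10)
        ok = tri(error, -5, 0, 5)
        high = tri(error, -10, -5, 0)
        # Rules: low -> open valve (+1), ok -> hold (0), high -> close (-1).
        # Defuzzify with a weighted average of the rule outputs.
        total = low + ok + high
        return (low * 1.0 + ok * 0.0 + high * -1.0) / total if total else 0.0

    for e in (7, 2, 0, -3, -8):
        print(e, "->", round(control(e), 2))  # smooth response, no thresholds

The point is the smooth blend between rules: the output degrades gracefully as the input drifts, which is exactly what you want when the sensors are noisy and the mechanics wear.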
Re: (Score:1)
Re "face detection" is not AI. Its a really big and fast database. Filled with faces the police know about and random people walking past CCTV.
Re "fuzzy logic controllers in washing machines" A set amount of power, water, weight of laundry is not AI. Just good programming within set limits.
Re "'finding directions" with maps that are created and set.
Re "recommendation algorithms" that is set by past people buying things and another person showing the same interests. More to do with collecting lots
Re: (Score:2)
> ..."face detection" is not AI. Its a really big and fast database. Filled with faces the police know...
...and just HOW do the faces "police know" get matched to this database? Explain without reference to AI.
> ..."fuzzy logic controllers in washing machines" A set amount of power, water, weight of laundry is not AI.
No it isn't, but you're a fool if you think your washing machine is that simple these days. It DOES take fuzzy logic to adapt to things like wear and tear on the machine, arbitrarily changing water pressures and temperatures, etc... and still maintain consistent performance.
> "'finding directions" with maps that are created and set.
...and using AI algorithms to find the best path.
Blah blah blah... you get the point. You've deliberately downplayed the AI aspect of every example (see the path-search sketch below).
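On the directions point, path search is the textbook AI example. Here's a toy Dijkstra in Python (made-up road graph; real routing engines use beefed-up variants such as A* with traffic-aware heuristics):

    import heapq

    # Made-up road graph: node -> [(neighbor, minutes), ...]
    roads = {
        "home":   [("mall", 10), ("school", 4)],
        "school": [("mall", 3), ("office", 12)],
        "mall":   [("office", 5)],
        "office": [],
    }

    def shortest_path(start, goal):
        # Classic Dijkstra: always expand the cheapest frontier node.
        queue = [(0, start, [start])]
        seen = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, minutes in roads[node]:
                heapq.heappush(queue, (cost + minutes, nxt, path + [nxt]))
        return None

    print(shortest_path("home", "office"))
    # -> (12, ['home', 'school', 'mall', 'office'])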
Re: (Score:2)
You are clearly uneducated, Troll.
If you actually wish to enlighten yourself, I'd start here: [1]https://en.wikipedia.org/wiki/... [wikipedia.org]
[1] https://en.wikipedia.org/wiki/Artificial_intelligence
Re: (Score:2)
> From the face detection in your phone, to the fuzzy logic controllers in washing machines, to the ant colony algorithms being used to route network traffic, to finding directions with google maps, to Netflix and Amazon's recommendation algorithms, to OCR for cheques and mail, to NEST thermostats, to robot vacuum cleaners and lawn mowers, to expert systems in medical diagnosis... (I could keep going)
When I took an AI class a few years ago, one of my favorite things the professor said was, "What we called 'AI' yesterday is simply the algorithm for how we do a thing today."
AI's Strength (Score:2)
AI (i.e. machine learning/neural networks) is really good at optimizing stuff, so its natural strength shows when you have hundreds of thousands of entities in a system. Examples are the electricity grid, playing Go, and a department store's inventory (toy sketch below).
In our individual lives, AI seems more like another drop in the bucket of too much technology, and I think one day we'll realize that less is more when it comes to the stuff in our homes.
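The "good at optimizing" point in code form: most modern ML boils down to gradient descent on a cost function. A toy one-parameter Python sketch (the cost function is made up for illustration; real systems do the same thing over millions of parameters):

    # Toy gradient descent: minimize cost(x) = (x - 3)^2 + 1.
    def cost(x):
        return (x - 3) ** 2 + 1

    def grad(x):
        return 2 * (x - 3)  # derivative of the cost

    x = 0.0
    for _ in range(50):
        x -= 0.1 * grad(x)  # step downhill, learning rate 0.1

    print(round(x, 3), round(cost(x), 3))  # converges to x = 3, cost = 1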
Getting concerned myself (Score:1)
I was looking at new fridges recently as a friend was asking for a recommendation, and it's alarming how trying to find a fridge without a screen is getting to be like trying to find a cell phone without a camera... it really limits your options.
The only way they could make fridges any worse is if the screens also played CNN constantly when not in use, like in an airport... you can absolutely see subsidized ad-fridges coming down the pipeline.
Re: (Score:2)
Seems like only the highest- and lowest-end fridges lack screens these days (as well as ice/water in the door, something else I could do without).
Re: (Score:2)
Come to my house. The refr *has* an ice/water dispenser in the door, but it hasn't worked for over a year. I think the tube to the water dispenser is frozen, and if it gets thawed, it just freezes up again. Same with the water dispenser on the refr nearest my office at work.
As for the ice dispenser on our refr, we never used it, so I took it out and got lots more room in the freezer. If we want ice cubes, we make them in trays, like the 1960s.
Re: (Score:3)
> What the hell does a fridge need a screen for?
You can connect it to a webcam inside the fridge and see if the light goes out when you close the door.
I remember a time... (Score:2)
My uncle was a computer scientist for a National Lab. He retired 15 or so years ago. I remember just after my grandmother first got internet, he didn't have it at his home yet because he didn't believe it was safe -this was probably 1997 or 98, and I remember him talking to me about how disappointed he was with the internet. "It was supposed to be this great thing. It's useless. It'll never amount to anything."
Yeah, he was wrong.
Re: (Score:2)
> My uncle was a computer scientist for a National Lab. He retired 15 or so years ago. I remember just after my grandmother first got internet, he didn't have it at his home yet because he didn't believe it was safe -this was probably 1997 or 98, and I remember him talking to me about how disappointed he was with the internet. "It was supposed to be this great thing. It's useless. It'll never amount to anything."
> Yeah, he was wrong.
Was he? Was he really?
How much of the internet is truly useful and how much is just trash? Judging by my inbox, the ratio is more than 10 to 1 SPAM to worthwhile messages (and that's AFTER the SPAM filters).
I find that this ratio pretty much governs the whole of the internet, where 1/10th of it is actually something of use and the rest is just useless junk.
So he's not that wrong.
Re: (Score:2)
And here you (and I) are.
I heard... (Score:1)
AI is turning frogs gay.
Re: (Score:2)
That's actually not true, the frogs are only gay for pay.
Nobody buys something because of AI (Score:3)
I did not see any example where someone says: "I did not buy that product because it lacked AI".
I did not hear from anyone that they need AI so they are going out of their way to buy it. In its current form AI is good for pattern recognition in some cases, for example, face identification in photos.
The only customers are corporations with massive collections of personal data to analyze, but not individual consumers.
I believe AI has been over-hyped and pushed into areas where it is not usable in its current form (like self-driving cars), and we are starting to see the backlash.
I've already seen stories saying that the medical diagnoses made by IBM's Watson are just plain wrong. More examples will follow.
Re: (Score:2)
> I did not see any example where someone says: "I did not buy that product because it lacked AI".
> I did not hear from anyone that they need AI so they are going out of their way to buy it. In its current form AI is good for pattern recognition in some cases, for example, face identification in photos. The only customers are corporations with massive collections of personal data to analyze, but not individual consumers. I believe AI has been over-hyped and pushed into areas where it is not usable in its current form (like self-driving cars), and we are starting to see the backlash.
> I've already seen stories saying that the medical diagnoses made by IBM's Watson are just plain wrong. More examples will follow.
What about Google home and Alexa?
How do you recognize pedestrians in self-driving cars without AI?
IBM Watson was wrong quite a bit, but it won Jeopardy!
First they ignore you, ... (Score:1)
First they ignore you, then they laugh at you, then they fight you, then you win.
Mahatma Gandhi
This field is moving so fast compared to the 90s.
It's a dead end because it's not very good anyway (Score:2)
So-called 'AI' is over-hyped and under-performing.
Another AI winter? (Score:2)
The AI bubble seems to be starting to deflate. It may not pop, but it will likely carry on shrinking. Most people already know that Alexa and co. are little more than gimmicks, good for party games, grins, and giggles. The AI community seems to be making the same mistakes it made in the late 60s and 70s. Another AI winter is nigh.
how do you see non-existent things ? (Score:2)
If Consumer AI doesn't have a future, how can that non-existent future be seen?
In an alternative interpretation, the author has seen the future of Consumer AI and so of course it exists. But the future of the future of Consumer AI doesn't exist. I.e. Future of Consumer AI doesn't have one - where "one" stands for future.
Any other interpretations?
It's not "Consumer AI" (Score:1)
Since the consumer is not in control of it.
It's Anti-Consumer AI, if anything.
Stupid industry fads (Score:5, Funny)
3D printer in every home will fundamentally change human society
IoT internet connected belt buckles and toothbrushes will take over the world
AI will revolutionize consumer electronics
Net PC from Sun will dominate the computer industry (this one is really old)
Re:Stupid industry fads (Score:5, Insightful)
Excessive hype is always followed by a trough of disillusionment. But as the TOD fades, plenty of mature, practical applications are likely to emerge. The technological naysayers are usually even more wrong than the hypesters.
[1]Hype cycle [wikipedia.org]
[1] https://en.wikipedia.org/wiki/Hype_cycle
Re: (Score:2)
> Excessive hype is always followed by a trough of disillusionment.
Pro Tip: Get out in front and mention this *before* taking your date home. Better for her to hear it from you than her working it out on her own ... :-)
Re:Stupid industry fads (Score:4, Insightful)
If smart phones and tablets are any indicator ...
AI, too, is an evolutionary dead end.
It's a buzzword with a vacuous definition.
Re: (Score:2)
Not a lot different than back in the 1950s, when the trend was to create all manner of odd gadgets to make life easier. Those deemed useful are still around... The rest can be found in junk markets around the world. But hey, the Cracker Barrels of the future will still need stuff to decorate their walls with.
Re: (Score:2)
In reaction to your sig:
I recently re-read "Nineteen Eighty-Four," because my first reading was so long ago.
Good read, but what a goddam depressing book!
Re: (Score:2)
> Excessive hype is always followed by a trough of disillusionment. But as the TOD fades, plenty of mature, practical applications are likely to emerge. The technological naysayers are usually even more wrong than the hypesters.
> [1]Hype cycle [wikipedia.org]
Back in the early PC days, when you had to hook up a cassette player to load your application, and then another one to load your data, we used to tell people they could store recipes on their TRS-80 personal computer. This was not much of a productivity enhancer. I'm sure based on this experience some people would have thought PCs were useless and had no future.
And then floppy disks and spreadsheets were invented.
[1] https://en.wikipedia.org/wiki/Hype_cycle
Re: (Score:2)
It is really difficult to say if the naysayers or hypesters are more often right or wrong. One problem with looking back at negative predictions is that we only really remember the ones that turned out to be wrong, since the evidence is in modern use today; as for the naysayers that were right, well, the things they were right about faded into obscurity.
Re: (Score:2)
You only count as a "true" naysayer if you are negative about an overhyped trend with groupies and fanbois, not about an obviously stupid idea.
The naysayers were right about the Segway, but that was an easy target, since it reached peak hype before it had even been shown to the public.
Other tech failures were Iridium, Zune, Pebble, Juicero. But none of these were hyped as world changing technology.
Re: (Score:2)
> In the long term only 1/20 companies really make it.
Success of a technology is rarely correlated with the success of particular companies. Silicon Valley is littered with plaques marking the graves of semiconductor pioneering companies. Few of them survived. Yet semiconductors have been the greatest technological success since fire was tamed.
For another example, look at aviation. It took 66 years to go from Kitty Hawk to the Sea of Tranquility. Yet how many airlines made money during those years? Almost none.
Re: (Score:1)
Prognosticators have been wrong before. While it is easy to poke fun at the unusual, who knows: perhaps in a few years dental floss will come with AI. The thought of not having AI floss will be unthinkable.
Re: (Score:3, Interesting)
As much as I am a nerd, I blame "nerds" for this. There is this whole new fad of being a "techie", watching Big Bang Theory, owning a Tesla, and generally being absolutely ignorant about real science, technology and math while "pretending" to be a nerd. I used "pretending" but there may be some legitimate attempt but it is hard to tell if someone is a fake nerd or just a stupid nerd. I think this trend partly follows from women trying to follow the (tech) money and then men trying to follow the women.
Re: (Score:2)
I don't know that there are a lot of these people, but they do exist, for certain yes. The 'watching Big Bang Theory' part is the kicker: once someone admits watching that, you know they're very unlikely to be a 'proper nerd', for lack of a better term.
Considering they only have partial skills in technology, we can likely guess that if they work in the industry, they're probably higher on the ladder than us and paid more though :/ like most management/consultant types.
Re: (Score:3)
The thing no one seems to consider is time.
"AI" being jammed into things now is probably lame, awkward, and of very limited use. Much like computers were back in the punch card days. Less than 100 years later we've got computers in our pockets. We are in the early days of AI; we'll look back on it decades from now as we do with things like: [1]https://www.youtube.com/watch?... [youtube.com]
This article is just another example of someone who can't see past their nose to the road ahead and the million different directions it could go.
[1] https://www.youtube.com/watch?v=Sp7MHZY2ADI
Re: (Score:2)
Good for a few workers over the decade of hype.
Re: (Score:2)
> 3D printer in every home will fundamentally change human society
> IoT internet connected belt buckles and toothbrushes will take over the world
> AI will revolutionize consumer electronics
> Net PC from Sun will dominate the computer industry (this one is really old)
I don't know about the home, but it plays a big part in manufacturing. There are very specialized and successful medical companies that use 3D printing.
I don't know about belt buckles, but Fitbit, the Apple Watch, and Garmin have been worth billions of dollars and fundamentally changed the way a lot of people do things.
I don't know about the NetPC, but what about the cloud? The hype that we would all put our stuff in the cloud blah blah actually materialized. There are many companies who own no hardware except the dev laptops.
Re: (Score:2)
Net PC was not from Sun. I should know, I worked for them during that era. What they had was the JavaStation, which was a neat idea but ahead of its time. That concept is now realised by the Chromebook. Net PC was a Compaq thing, if I recall correctly. However, Wikipedia tells me it was Oracle, so perhaps the Compaq device was called something else.