So what happens if Alexa, Siri, or Google Assistant gives you bad advice?
It has happened: the Waze navigation app sent some Israelis onto the wrong road in the Palestinian territory of the West Bank. They were put in danger by a hostile mob but survived.
I personally think it is the same as when your doctor, tax-preparing CPA, or contractor gives you wrong advice: they are legally liable for the damages that result from following it.
We treat a whole group of people as reasonable authorities and we expect to be able to act on their advice with confidence.
I think that Alexa, Siri, and Google Assistant are part of businesses with the same level of authority as any business that purports to give advice, counseling, or reliable information.
I see no legal protection for artificial intelligence that exempts it from legal liability for errors arising from its advice or counsel.
A particularly difficult problem arises when Alexa, Siri, or Google Assistant draws on Wikipedia, which contains many errors of fact and conjecture. Wikipedia also has a Lefty bias in many categories of material. Because Wikipedia is an acknowledged crowd-created source whose content is constantly changing, it would be hard to hold it legally liable.
Alexa, Siri, and Google Assistant should probably announce when their source is Wikipedia; they need protection from that particular sinkhole. The same is true of Snopes, a particularly unreliable Lefty online source of information.
All of this will be sorted out in the courts over the next decade. I think I can see the direction the law will take.