In brief You can recall a song’s melody clearly in your mind, yet its name stays stubbornly on the tip of your tongue. What do you do? Well, now you can hum it directly into your smartphone and Google will try its best to identify the tune.
“A song’s melody is like its fingerprint: they each have their own unique identity,” said Krishna Kumar, senior product manager at Google Search, this month. “We’ve built machine learning models that can match your hum, whistle or singing to the right ‘fingerprint’.”
The tech is available via Google Assistant on iOS and Android, and all it needs is a sample of you humming for about ten to fifteen seconds. It only works with English-language tracks at the moment. Rival apps, such as SoundHound, have offered hum-based song search for some time.
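To give a flavour of the "melody as fingerprint" idea, here is a deliberately simplified sketch, not Google's actual system: a tune is reduced to its sequence of pitch intervals (the steps between successive notes), which stays the same whatever key you hum in, and a query is matched to a catalogue by edit distance. The song titles and note data below are made up for illustration.

```python
# Toy melody fingerprinting: reduce a tune to relative pitch intervals,
# then match by Levenshtein (edit) distance. Illustrative only.

def intervals(notes):
    """Pitch steps between successive notes; key-invariant by construction."""
    return tuple(b - a for a, b in zip(notes, notes[1:]))

def edit_distance(a, b):
    """Classic Levenshtein distance between two interval sequences."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

# Hypothetical catalogue: MIDI note numbers of each melody's opening.
CATALOGUE = {
    "Twinkle Twinkle": [60, 60, 67, 67, 69, 69, 67],
    "Ode to Joy": [64, 64, 65, 67, 67, 65, 64, 62],
}

def best_match(hummed_notes):
    """Return the catalogue title whose interval sequence is closest."""
    query = intervals(hummed_notes)
    return min(CATALOGUE,
               key=lambda t: edit_distance(query, intervals(CATALOGUE[t])))

# A hum of Twinkle Twinkle transposed up a whole tone still matches,
# because only the intervals are compared:
print(best_match([62, 62, 69, 69, 71, 71, 69]))  # Twinkle Twinkle
```

A production system would of course work from raw audio, tolerate timing and pitch errors, and search millions of tracks with learned embeddings rather than exact interval strings.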
Cruise to test cars with no drivers in SF
Cruise, the San Francisco-based self-driving car startup acquired by General Motors in 2016, has been given the green light to drive its computer-controlled electric vehicles around the city without any humans inside.
“Before the end of the year, we’ll be sending cars out onto the streets of SF — without gasoline and without anyone at the wheel,” CEO Dan Ammann announced. “Because safely removing the driver is the true benchmark of a self-driving car, and because burning fossil fuels is no way to build the future of transportation.”
Cruise had planned to start the autonomous service last year but decided more testing was needed. Now, Ammann said, its time had come and the biz would prove its worth on “the chaotic, gritty streets of San Francisco.”
Alphabet-backed Waymo launched a limited robo-taxi service in Phoenix, Arizona, earlier this month.
Non-profit drops out of Partnership on AI
Access Now, a non-profit org focused on fighting for people’s digital rights, has cancelled its membership of the Partnership on AI (PAI), a group backed by several large tech companies to develop machine-learning systems responsibly.
PAI receives funding from more than 100 partners, including Facebook, Amazon, Microsoft, and Google, as well as smaller non-profits, to conduct research into the social impacts of AI. The group has published reports on algorithmic bias, facial recognition, and the law, though the documents appear to have had little effect and aren’t directly implemented by any of its partners.
“While we support dialogue between stakeholders, we did not find that PAI influenced or changed the attitude of member companies or encouraged them to respond to or consult with civil society on a systematic basis,” Access Now said in a statement.
“As a human rights organization, we support human rights impact assessments and red lines around use of these technologies, rather than an ethics, risk-based, or sandboxing approach.” ®