Posts in potpourri
Kitchen robots potpourri


The World's First Home Robotic Chef Can Cook Over 100 Meals

This year, a London-based company will launch Moley, the first robotic kitchen, which has unlimited access to chefs and their recipes worldwide. It is expected to cook and clean up after itself, though it apparently does not completely eliminate the need for human supervision.

To use the machine, you first filter the recipe library by the number of portions, type of cuisine, dietary restrictions, calorie count, desired ingredients, cooking method, chef, and so on. Then, with a single tap, you choose your recipe, place the individual pre-packaged containers of measured, washed, and cut ingredients (which you can order through Moley) on designated spots, and press “start” to begin cooking.
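As a rough sketch, the recipe-selection step described above amounts to filtering a recipe library by user constraints. Everything below (field names, recipes, tags) is invented for illustration; this is not Moley's actual API.

```python
# Hypothetical recipe filter in the spirit of the Moley app's selection
# step. The library, field names, and tags are illustrative assumptions.

recipes = [
    {"name": "Crab bisque", "cuisine": "French", "calories": 480,
     "portions": 4, "tags": {"seafood"}},
    {"name": "Veggie stir-fry", "cuisine": "Chinese", "calories": 350,
     "portions": 2, "tags": {"vegetarian", "vegan"}},
]

def find_recipes(library, cuisine=None, max_calories=None,
                 portions=None, required_tags=()):
    """Return names of recipes matching every constraint supplied."""
    results = []
    for r in library:
        if cuisine and r["cuisine"] != cuisine:
            continue
        if max_calories is not None and r["calories"] > max_calories:
            continue
        if portions is not None and r["portions"] < portions:
            continue
        if not set(required_tags) <= r["tags"]:
            continue
        results.append(r["name"])
    return results

print(find_recipes(recipes, max_calories=400, required_tags={"vegetarian"}))
# -> ['Veggie stir-fry']
```

With a real recipe library, each constraint (calorie count, dietary restriction, chef) would simply become another filter clause.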

Since the Moley kitchen could essentially cook any downloadable recipe on the internet, the food-robotics-AI startup expects to include a “share and sell” your own recipes feature, where consumers and professional chefs could access and sell their ideas via the “digital style library of recipes” database.

However, the article raises safety and quality concerns about having a robot chef: what if the machine chops aimlessly and the owner is left without a meal? Further, cooking involves the chef's personal touch and an engagement of all five senses, which a robot cannot replicate.

Our Robot Overlords Are Now Delivering Pizza, And Cooking It On The Go

To solve the problem of cold pizzas, Zume Pizza, where robots and AI run the show, was started in Mountain View, California.

A customer places an order on the app. A team of mostly robots assembles the 14-inch pies, each of which is loaded onto the truck par-baked, or partially baked.

There is only one human worker in the delivery truck, who drives, slices, and delivers to your doorstep. The human does not have to think about when to turn the ovens on and off or what route to take, because the AI decides all of that. A few minutes before arriving at the scheduled delivery destination, the AI starts the oven to finish cooking the order.
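The oven-timing decision is simple to sketch: given a predicted arrival time and the remaining bake time, the system works backward to a start time. This is an illustrative toy, not Zume's software.

```python
# Toy sketch of the onboard oven scheduling idea: switch the oven on
# just early enough that the bake finishes at the delivery ETA.
from datetime import datetime, timedelta

def oven_start_time(eta: datetime, finish_bake: timedelta) -> datetime:
    """Return when to switch the oven on so baking completes at the ETA."""
    return eta - finish_bake

eta = datetime(2018, 3, 1, 18, 30)            # predicted arrival at the door
start = oven_start_time(eta, timedelta(minutes=4))
print(start)  # 2018-03-01 18:26:00
```

In practice the ETA would be recomputed continuously from traffic data, with the start time adjusted to match.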

Augmented reality kitchens keep novice chefs on track

Japan is not far behind with regard to the use of robots in cooking. Scientists at Kyoto Sangyo University have developed a kitchen with ceiling-mounted cameras and projectors that overlay cooking instructions on the ingredients. This lets cooks concentrate on their task (e.g., slicing) without having to look up at a recipe book or a screen.

Suppose you want to fillet a fish. Lay it down on a chopping board and the cameras will detect its outline and orientation so the projectors can overlay a virtual knife on the fish with a line indicating where to cut. Speech bubbles even appear to sprout from the fish’s mouth, guiding you through each step.

The kitchen also comes equipped with a small robot assistant named Phyno that sits on the countertop. When its cameras detect that the chef has stopped touching the ingredients, Phyno asks whether that particular step in the recipe is complete. Users can answer “yes” to move on to the next step or “no” to have the robot repeat the instructions.

Flippy, the hamburger cooking robot, gets its first restaurant gig

Caliburger, a fast food chain based in California, is using Flippy to flip hamburgers. Flippy is an industrial robotic arm with a classic spinning spatula.

The upgrade from a clasping claw to a classic spinning spatula took a lot of programming, but it was necessary. After all, you need the easiest-to-clean surface when dealing with raw meat: you really don’t want that stuff getting caught up in a device’s various nooks and crannies.

The developers of Flippy are working on a number of new features for the robot, including advanced computer imaging and AI that will help it adapt over time to things like a changing seasonal menu.


Robots Cooked and Served My Dinner

In the Chinese city of Kunshan, a small team of robot cooks and waiters serves dumplings and fried rice at Tian Waike Restaurant.

“A robot can work for seven to eight years and more than ten hours a day,” Song Yugang, the owner of the company that designed the robots said. “Waiters and waitresses work for eight hours every day, nine at most. You need to provide accommodations and meals. But our robots consume three yuan [50 cents, or 30 pence] worth of electricity a day at most.”
AI potpourri: Passenger pickup and suicide prevention

GM just revealed a fully autonomous electric car — and it doesn't have a steering wheel

GM has announced its fourth generation of self-driving vehicles. Notably, there is not a single mention of what the passenger is supposed to do in the event that the self-driving algorithm fails!

No driver. No pedals. No steering wheel. Just seats and screens and doors that can close themselves. That’s what riders will see when they get into one of General Motors’ Cruise self-driving electric vehicles, scheduled to hit the road in 2019.


A prominent social scientist, Dr. Peter Hancock, aptly stated the following:

Today’s new car, a partial robot itself built by robots in an automated factory, may for a time be content to sit in a parking spot and wait for its user’s call. But if people aren’t careful, its fully autonomous cousin may one day drive the joy of driving, or even an entire joy of living, out of human experience.


Would You Send Your Kids To School On A Self-Driving School Bus?

A Seattle-based design firm is working on a six-passenger vehicle that picks up and drops off every child at their front door, verifying their identity with facial recognition.

The vehicle’s AI changes its route based on traffic or other roadblocks, even rejiggering the order in which it drops kids off if, for instance, their parent is running late. And during the rest of the day, each Hannah vehicle can be used to deliver packages, food, or donations, earning school districts extra cash.

But questions remain. Will parents ever trust an autonomous vehicle enough to allow their children to ride in one with no human supervision? And will autonomous technology ever be advanced enough to supervise children, much less cheap enough for school districts to afford? Hannah is a kind of thought experiment: If autonomy is coming to every street, what does getting to school look like?

The researchers at the design firm are also investigating other issues, such as how AI will address bullying on buses, as well as ways of bringing in extra money for the school by using the bus for food delivery through a service like Uber Eats.

Canada will track suicide risk through social media with AI

The Canadian government is partnering with an AI firm to predict rises in regional suicide risk. Facebook has also recently launched initiatives to prevent suicides by analyzing posts that suggest suicidal thoughts.

The AI will analyze posts from 160,000 social media accounts and will look for suicide trends.

The AI company aims to be able to predict which areas of Canada might see an increase in suicidal behavior, which according to the contract document includes “ideation (i.e., thoughts), behaviors (i.e., suicide attempts, self-harm, suicide) and communications (i.e., suicidal threats, plans).” With that knowledge, the Canadian government could make sure more mental health resources are in the right places when needed.
AI potpourri: Reading, investing, diagnosis, and retail

A.I. Has Arrived in Investing. Humans Are Still Dominating

AI is taking a bigger role in investing. Large fund management companies like Fidelity and Vanguard say they use AI for a range of purposes.

An exchange-traded fund introduced in October uses A.I. algorithms to choose long-term stock holdings.

It is too early to say whether the E.T.F., A.I. Powered Equity, will be a trendsetter or merely a curiosity. Artificial intelligence continues to become more sophisticated and complex, but so do the markets. That leaves technology and investment authorities debating the role of A.I. in managing portfolios. Some say it will only ever be a tool, valuable but subordinate to its flesh-and-blood masters, while others envision it taking control and making decisions for many funds.

AI has an edge over the natural kind because of the inherent emotional and psychological weaknesses that encumber human reasoning.

While some people see huge potential in AI as an investment advisor, others think that it cannot be relied on for heavy cognitive decision-making. The following is a quote from a portfolio manager:

“I’m a fan of automating everything possible, but having a human being push the last button is still a good thing. Hopefully, we all get better and better and smarter and smarter, but there’s something comforting about having an informed human being with sound judgment at the end of the process.”

AI models beat humans at reading comprehension, but they’ve still got a ways to go

AI models designed by Alibaba and Microsoft have surpassed humans at reading comprehension, which suggests that AI has the potential to understand and process the meaning of words with the same fluidity as humans. But there is still a long way to go. Specifically, adding meaningless text to the passages, which a human would easily ignore, tended to confuse the AI.

“Technically it’s an accomplishment, but it’s not like we have to begin worshiping our robot overlords,” said Ernest Davis, a New York University professor of computer science and longtime AI researcher.

“When you read a passage, it doesn’t come out of the clear blue sky: It draws on a lot of what you know about the world,” Davis said. “We really need to deal much more deeply with the problem of extracting the meaning of a text in a rich sense. That problem is still not solved.”

5 ways the future of retail is already here

The retail industry is also starting to rely on AI to shape the way people shop. 

  1. Digital-price displays at grocery stores (e.g., Kroger) now allow retailers to make changes to their prices in one go. 
  2. Digital mirrors are used by retailers such as Sephora and Neiman Marcus to allow shoppers to get feedback on makeup and other items.
  3. Robotic shopping carts can now import your shopping list, guide you to each item in the store, help you to check out, follow you to your car for unloading groceries, and find its way back to a docking station.
  4. Technology is being used by companies like Stitch Fix and American Eagle to recommend outfits to their customers. 
  5. Robots are being used in stores to keep shelves well stocked to help shoppers find what they are looking for.  

Microsoft and Adaptive Biotechnologies announce partnership using AI to decode immune system; diagnose, treat disease

AI and the cloud have the power to transform healthcare – improving outcomes, providing better access and lowering costs. The Microsoft Healthcare NExT initiative was launched last year to maximize the ability of artificial intelligence and cloud computing to accelerate innovation in the healthcare industry, advance science through technology, and turn the lifesaving potential of new discoveries into reality.

Each T-cell has a corresponding surface protein called a T-cell receptor (TCR), which has a genetic code that targets a specific signal of disease, or antigen. Mapping TCRs to antigens is a massive challenge, requiring very deep AI and machine learning capabilities coupled with emerging research and techniques in computational biology applied to genomics and immunosequencing.

The result would be a true breakthrough: sequencing the immune system can reveal what diseases the body is currently fighting or has ever fought.

The Year of the Algorithm. AI potpourri, part I: Astronomer, Factory Worker, Musician, and more

2017 seems to have been a watershed year for the use and application of AI and algorithms. This is part 1 of a two-part post highlighting the use (and possible regulation) of AI.

[NYTimes] An 8th Planet Is Found Orbiting a Distant Star, With A.I.’s Help

NASA announced the discovery of a new exoplanet orbiting Kepler 90, a distant star some 2,500 light-years away.

The new exoplanet was detected with the help of an artificial intelligence researcher at Google using a machine learning technique based on neural networks.

The technology, which is loosely inspired by the human brain, is designed to recognize patterns and classify images.
In many factories, workers look over parts coming off an assembly line for defects.

Andrew Ng, co-founder of Alphabet Inc.’s Google Brain team, has launched a new venture with iPhone assembler Foxconn to bring AI and so-called machine learning onto the factory floor.

He said he understands that his firm’s technology is likely to displace factory workers, but that the firm is already working on how to train workers for higher-skilled, higher-paying factory work involving computers.
Bing is working on a system to help users get to the information they are looking for even if they aren’t exactly sure how to find it. For example, let’s say you are trying to turn on Bluetooth on a new device. The new system could prompt users to provide more information, such as the type of gadget or operating system they are using.

Another new, AI-driven advance in Bing is aimed at getting people multiple viewpoints on a search query that might be more subjective.

Microsoft also announced plans to release a tool that highlights action items in email and gives you options for responding quickly on the go.
Researchers at MIT want to get rid of subjective feelings in treatment by using a facial recognition algorithm that can detect your pain levels by studying your face.

Trained on thousands of videos of people wincing in pain, the algorithm creates a baseline for each patient based on common pain indicators – generally, movements around the nose and mouth are telltale signs.

So far, the algorithm is 85% successful at weeding out the fakers, meaning that people trying to fake pain to get prescription painkillers may soon be out of business.
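The per-patient baseline idea can be sketched as a simple standardized score: how far a new facial-movement reading deviates from that patient's calm-state readings. This is a toy illustration, not the MIT model, and all numbers are made up.

```python
# Toy sketch of per-patient baselining (not the MIT algorithm): score a
# new facial "pain indicator" reading against that patient's own history.
from statistics import mean, stdev

def pain_score(baseline_readings, new_reading):
    """Standardized deviation of a reading (e.g., nose/mouth movement
    intensity) from the patient's calm-state baseline."""
    mu = mean(baseline_readings)
    sigma = stdev(baseline_readings)
    return (new_reading - mu) / sigma

baseline = [0.10, 0.12, 0.11, 0.09, 0.13]   # calm-state samples (made up)
print(pain_score(baseline, 0.35) > 2.0)      # large deviation -> flag
```

The real system learns which facial movements matter from thousands of videos; the baseline comparison is the part sketched here.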
In the city (London) that spawned David Bowie, Pink Floyd, and the Spice Girls, two college professors are working on an artificial intelligence capable of making its own music. And it’s already played its first show.

The race is on to see whether A.I. can add something meaningful to this cultural activity.

The pair invited a number of musicians to come together for a show called “Partnerships,” a reference to the relationship between human and machine. The show featured a mix of compositions, all performed by humans, with varying levels of input from the A.I. Some compositions took the computer’s work as a starting point, some used the project as inspiration, while others directly played the generated work as it stood.
Artificial intelligence could one day scan the music videos we watch to come up with predictive music discovery options based on the emotions of the performer.

Consumers of the future will rely on computer software to serve them music discovery options. YouTube Red and the YouTube Music app do a good job of serving up new and different options for music discovery, but it’s dragged down by its inability to actually identify what’s playing on the screen. Sure, Google knows which videos you gave a thumbs up to, watched 50 times on repeat, shared on social media, and commented on, but it doesn’t have the visual cues to tell it why.
Macy's, CVS, Starbucks, and Sephora turn to AI

If you are scrambling to find last-minute gifts, AI/machine learning is here to help! All the major retailers are now turning to AI to learn what you want. Big data about retail purchases is being fed into machine learning algorithms to learn things about you. Here are some examples. By the way, have you ever wondered, "what exactly is machine learning?" Then see the end of this post for an easily digestible video.

[Forbes] Macy's Teams With IBM Watson For AI-Powered Mobile Shopping Assistant

Macy’s is set to launch an in-store shopping assistant powered by artificial intelligence thanks to a new tie-up with IBM Watson via developer partner and intelligent engagement platform, Satisfi.

Macy’s On Call, as it’s called, is a cognitive mobile web tool that will help shoppers get information as they navigate 10 of the retail company’s stores around the US during this pilot stage.

Customers are able to ask questions in natural language, ranging from where specific products, departments, and brands are located to what services and facilities can be found in a particular store. In return, they receive customised, relevant responses. The initiative is based on the idea that consumers are increasingly more likely to turn to their smartphones than to a store associate for help when out at physical retail.
If you always have a caramel macchiato on Mondays, but Tuesdays call for the straight stuff, a double espresso, then Starbucks Corporation (SBUX) is ready to know every nuance of your coffee habit. If you’re a Rewards member, there will be no coffee secrets between you and Starbucks.

The chain’s regulars will find their every java wish ready to be fulfilled, and the food and drink items you haven’t yet thought about presented to you as what you’re most likely to want next.

So targeted is the technology behind this program that, if the weather is sunny, you’ll get a different suggestion than if the day is rainy.
Patients tend to be at their local CVS much more frequently than at the doctor. People are also increasingly using fitness trackers like FitBits, smartwatches, and even Bluetooth-enabled scales that are all collecting data patients can choose to share with a provider. All that data isn’t worth much though unless it is carefully interpreted — something Watson can do much more efficiently than a team of people.

A drop in activity levels, a sudden change in weight, or prescriptions that aren’t being filled are the kinds of things that might be flagged by the system. Certain changes could even indicate a developing sickness before someone feels ill — and certainly before someone decides to visit the doctor.
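The kind of flagging described can be sketched as simple rules comparing recent tracker data against a patient's history. Thresholds and field names below are assumptions for illustration, not Watson's logic.

```python
# Hypothetical rule-based flagging of sudden health changes from tracker
# data. Thresholds and field names are invented for illustration.

def flag_changes(history, recent, weight_delta=5.0, activity_drop=0.5):
    """Return human-readable flags for changes worth a follow-up."""
    flags = []
    if abs(recent["weight"] - history["weight"]) >= weight_delta:
        flags.append("sudden weight change")
    if recent["daily_steps"] < history["daily_steps"] * activity_drop:
        flags.append("drop in activity level")
    if recent["rx_filled"] < history["rx_filled"]:
        flags.append("prescriptions not being filled")
    return flags

history = {"weight": 160.0, "daily_steps": 8000, "rx_filled": 2}
recent  = {"weight": 152.0, "daily_steps": 3000, "rx_filled": 1}
print(flag_changes(history, recent))
```

A learned model could replace these hand-set thresholds, but the output, a short list of flags for a human clinician, would look much the same.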

[AdWeek] Sephora Mastered In-Store Sales By Investing in Data and Cutting-Edge Technology

I love Sephora. As the article aptly states, "Sephora isn’t your mother’s makeup company; it’s your modern tech company". I have personally tried Color IQ, their in-store program that scans faces to find the right shade of foundation and other products for different skin tones. Sephora has an amazing Beauty Insider program that provides it with a lot of rich data about its consumers, and now the company is leveraging AI to let customers virtually try on make-up and to spice up its online presence.

Sephora’s innovation lab in San Francisco is tooling with an artificial intelligence feature dubbed Virtual Artist within its mobile app that uses facial recognition to virtually try on makeup products.

[CGP Grey] How do machines learn?

The science behind machine/deep learning neural networks is quite interesting. For example, the discussion in the video about us not knowing what exactly is being learned (the hidden layers) is interesting to me. But you don't have time for that! Here is an easily understood video:

Robot potpourri: Concierge, security guard, and VIP greeter
Connie, Hilton's robot concierge, will work side-by-side with Hilton’s Team Members to assist with visitor requests, personalize the guest experience and empower travelers with more information to help them plan their trips.

The more guests interact with Connie, the more it learns, adapts and improves its recommendations. The hotel will also have access to a log of the questions asked and Connie’s answers, which can enable improvements to guests’ experiences before, during and after their stays.

Connie is powered by Watson, a cognitive computing technology platform that represents a new era in computing where systems understand the world in the way that humans do - through senses, learning and experience.


After backlash, animal shelter fires security robot, “effective immediately”

The San Francisco-based Society for the Prevention of Cruelty to Animals (SPCA) has been asked to halt the use of its security robot, which it had started using after experiencing a lot of car break-ins, theft, and vandalism. The SPCA reported that crime declined after it adopted the robot. However, some branded it the "anti-homeless" robot, saying its aim was to dislodge homeless campers and its appearance was creepy.

Mitra: The ‘Made in India’ robot that stole the show at GES Hyderabad

The Global Entrepreneurship Summit last year was inaugurated with Narendra Modi and Ivanka Trump pressing a button on a robot developed by a startup based in Bangalore, India.

Variations of the robot are envisioned for customer assistance, where they are projected to increase sales via smart conversations, as well as for roles such as party photographer, DJ, and live tweeter.

Mitra features facial recognition technology, allowing the robot to quickly identify a person and deliver customised services.

The humanoid also understands multiple languages. At the moment, Mitra supports Kannada and English but is soon going to add support for Hindi as well.
Can Robots Address Unethical Issues in Fashion?

The fashion industry is rife with ethical issues, from the high end (haute couture, impossible body standards for models) to the low end (fast fashion, manufacturing). Can robots solve these issues?

[NY Times] Fashion Finds a More Perfect Model: The Robot

This article mainly discusses how fashion is embracing the look of robots.  But could robots soon replace fashion models?

Fashion has been especially quick to seize on the notion that robots are slicker, more perfect versions of ourselves. In the last few months alone, androids have filtered into the glossies and stalked the runways of designers as audacious as Thom Browne and Rick Owens, and of inventive newcomers like David Koma, who riffed on fembot imagery in his fall 2015 collection for Mugler, sending out models in frocks that were patterned with soldering dots and faux computer circuitry.

In a Steven Klein photo shoot in the current Vogue, drones hover overhead, seeming to spy on a party of human models cavorting in a field. For the March issue of W magazine, he portrayed the designer Jason Wu wrapped in the arms of a tin man.

[Reuters] Meet Lulu Hashimoto, the 'living doll' fashion model

Japan is not far behind: there, a doll that moves like a human co-exists with humans, is active in the fashion scene, and is being idolized.

Meet Lulu Hashimoto, a “living doll” and the latest trend in Tokyo’s fashion modeling scene.

Lulu’s ability to blur the line between reality and fiction has mesmerized fans on social media, where the Lulu Twitter and Instagram accounts have drawn tens of thousands of followers.

While popular among fans of Japanese subculture, Lulu is now turning heads at the annual Miss iD beauty pageant where she is among the 134 semi-finalists chosen from around 4,000 entrants.
While automation does take away human jobs, the current frenzy over cheap clothing has created a whole host of unethical labor issues—like the ones that recently caused a factory fire in India killing 13 people—and robots could potentially avert that.

Robots in apparel manufacturing may be good, or they may be bad. They may give us cheap clothes and U.S. jobs (at managerial and administrative level), or they may detrimentally impact the economies of developing nations.
AI potpourri: AI gets a job at NASA, finds serial killers, stops suicide, selects embryos, and interviews you!

[The New Yorker] The Serial-Killer Detector

This article discusses how Thomas Hargrove, a retired journalist with access to a large collection of murder records, created an algorithm that was able to find crime patterns.

He began trying to write an algorithm that could return the victims of a convicted killer. As a test case, he chose Gary Ridgway, the Green River Killer, who, starting in the early eighties, murdered at least forty-eight women in Seattle, and left them beside the Green River.
Facebook’s new “proactive detection” artificial intelligence technology will scan all posts for patterns of suicidal thoughts, and when necessary send mental health resources to the user at risk or their friends, or contact local first-responders. By using AI to flag worrisome posts to human moderators instead of waiting for user reports, Facebook can decrease how long it takes to send help.

It’s also dedicating more moderators to suicide prevention, training them to deal with cases 24/7, and now has 80 local partners, like the National Suicide Prevention Lifeline and Forefront, from which to provide resources to at-risk users and their networks.

Misses and false alarms should be factored in when designing an automation algorithm. Too many misses have catastrophic consequences in a high-risk situation. Facebook's AI is an example of an automated system where the cost of misses far outweighs the nuisance of false alarms.

[GCN] NASA’s newest employee isn’t human

This article talks about the newest employee at the NASA Shared Services Center: a bot named Washington. Washington is a rules-based bot that follows a fixed set of rules; NASA expects that future bots will have higher-order cognitive processing abilities.

One of the newest employees at the NASA Shared Services Center can copy and paste text, open emails, move folders and many other tasks. That might sound routine, but the new hire, Washington, isn’t a person — it’s a bot.

Much like a human employee, however, Washington has its own computer, its own email account, its own permissions within applications and its own role within the organization.

The bots, which can run 24/7, can help NASA by taking on time-consuming, manual tasks and allowing its humans to engage in higher level work.
Scientists are using artificial intelligence (AI) to help predict which embryos will result in IVF success.

AI is able to recognise and quantify 24 image characteristics of embryos that are invisible to the human eye. These include the size of the embryo, texture of the image and biological characteristics such as the number and homogeneity of cells.

[New York Post] AI already reads your resume – now it’s going to interview you, too

This article discusses how AI is being used by companies to improve their recruiting process. 

Marriott International Inc. announced the launch of Marriott Careers chatbot for Facebook Messenger, a computer program designed to simulate conversation with job seekers. The virtual assistant aims to create a more personalized, efficient experience for applicants.

“Once you apply for a job, the system sends you updates. If it isn’t available, when another job meets your specific qualifications, you’ll receive a direct message on your digital device,” says Rodriguez, executive vice president and global chief human resources officer for Marriott. “Generation Z, which is starting to graduate from college, has a strong preference to communicate with companies this way. It’s the wave of the future.”

Unilever is also using AI to narrow down candidates based on their speech, facial expressions and body language.

“Hey Siri, how are my crops doing?” Autonomy in Agriculture Potpourri

Modern agriculture is only possible with the use of advanced technology.  In an upcoming interview, we will learn about what the future of agriculture looks like with highly advanced autonomous systems and how farmers are reacting and coping.

Until then, here are some interesting stories about autonomous systems and agriculture.

[U.S. Department of Agriculture] Smart Phones: The Latest Tool for Sustainable Farming

It is nice to see AI being used to help meet the food demands of a growing world population. For example, the U.S. Department of Agriculture has developed two apps, “LandInfo” and “LandCover,” available on the Google Play Store.

With LandInfo, users can collect and share soil and land-cover information as well as gain access to global climate data. The app also provides some useful feedback, including how much water the soil can store for plants to use, average monthly temperature and precipitation, and growing season length.

LandCover simplifies data collecting for use in land-cover inventories and monitoring. The app automatically generates basic indicators of these cover types on the phone and stores the data on servers that are accessible to users worldwide.

[BBC News] Tell me phone, what's destroying my crops?

AI is also being used in India to help farmers. Drought, crop failure, and lack of access to modern technology make life hard for Indian farmers. In fact, an estimated 200,000 farmers have ended their lives in the last two decades due to debt. A group of researchers from Berlin has developed an app called Plantix to help farmers detect crop diseases and nutrient deficiencies in their crops.

The farmer photographs the damaged crop and the app identifies the likely pest or disease by applying machine learning to its growing database of images.

Not only can Plantix recognise a range of crop diseases, such as potassium deficiency in a tomato plant, rust on wheat, or nutrient deficiency in a banana plant, but it is also able to analyse the results, draw conclusions, and offer advice.

[Western Farm Press] Smartphones and apps taking agriculture by storm

AI has also brought farmers a lot of convenience. They can now perform tasks such as starting or stopping center-pivot irrigation systems from the comfort of their home.

“Before, I might have to go out in the rain at 2 a.m. to turn off a center pivot or check to make sure it was operating,” says Schmeeckle. “Now I can turn a pivot on or off with my smartphone. I even started one while we were 300 miles away on vacation this summer, and it was still running when I got home.”
Through the IoT, sensors can be deployed wherever you want–on the ground, in water, or in vehicles–to collect data on target inputs such as soil moisture and crop health. The collected data are stored on a server or cloud system wirelessly, and can be easily accessed by farmers via the Internet with tablets and mobile phones. Depending on the context, farmers can choose to manually control connected devices or fully automate processes for any required actions. For example, to water crops, a farmer can deploy soil moisture sensors to automatically kickstart irrigation when the water-stress level reaches a given threshold.
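The soil-moisture example above boils down to a threshold rule, sketched here with invented numbers (a real controller interface would differ):

```python
# Minimal sketch of the irrigation threshold rule described: kick-start
# irrigation when averaged soil-moisture readings fall below a set point.
# Readings, threshold, and the "controller" are illustrative assumptions.

def should_irrigate(sensor_readings, threshold=0.25):
    """True when mean volumetric soil moisture drops below the threshold."""
    avg = sum(sensor_readings) / len(sensor_readings)
    return avg < threshold

readings = [0.21, 0.24, 0.19]  # fraction of water by volume, per sensor
if should_irrigate(readings):
    print("irrigation: ON")    # in practice, a command to the pivot controller
```

In a deployed IoT setup, the readings would arrive over the network and the decision would trigger the connected irrigation hardware rather than a print statement.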

[MIT Technology Review] Six ways drones are revolutionizing agriculture

The market for drone-powered solutions in agriculture is estimated at $32.4 billion. Applications include soil and field analysis, planting, crop spraying, crop monitoring, irrigation, and health assessment.

Agricultural producers must embrace revolutionary strategies for producing food, increasing productivity, and making sustainability a priority. Drones are part of the solution, along with closer collaboration between governments, technology leaders, and industry.
Lettuce Bot is a machine that can “thin” a field of lettuce in the time it takes about 20 workers to do the job by hand.

After a lettuce field is planted, growers typically hire a crew of farmworkers who use hoes to remove excess plants to give space for others to grow into full lettuce heads. The Lettuce Bot uses video cameras and visual-recognition software to identify which lettuce plants to eliminate with a squirt of concentrated fertilizer that kills the unwanted buds while enriching the soil.
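One way to sketch the thinning decision (not Lettuce Bot's actual algorithm) is to keep detected seedlings at a minimum spacing along the row and mark the rest for the fertilizer squirt:

```python
# Illustrative thinning sketch: given detected plant positions along a
# row, keep seedlings at a minimum spacing and mark the rest for
# elimination. Positions and spacing are invented for illustration.

def thin_row(positions_cm, min_spacing_cm=25):
    """Return (keep, eliminate) lists of plant positions along the row."""
    keep, eliminate = [], []
    last_kept = None
    for p in sorted(positions_cm):
        if last_kept is None or p - last_kept >= min_spacing_cm:
            keep.append(p)
            last_kept = p
        else:
            eliminate.append(p)
    return keep, eliminate

keep, squirt = thin_row([0, 10, 27, 33, 58, 61])
print(keep)    # [0, 27, 58]
print(squirt)  # [10, 33, 61]
```

The hard part in the real machine is the visual-recognition step that produces those positions; the spacing decision itself is this simple.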
Applications of AI Potpourri: AI curbs sex trafficking, outs potentially gay men, avoids pedestrians and more!

Today's potpourri shows the diverse applications of AI, from addressing social problems to creating unanticipated ones to increasing efficiency in transportation.

[TechRepublic] Apple's autonomous car software uses neural networks to improve navigation, object detection

Apple scientists are working on a new system called VoxelNet that can help self-driving cars identify pedestrians and cyclists by improving Light Detection and Ranging (LiDAR), without additional cameras or sensors. Driving is such an integral part of everyday living that Apple's foray into this space is certainly not surprising.

This marks Apple’s first official steps in the autonomous vehicle market, joining companies including Google, Tesla, Uber, and Intel.
Online marketplace AHA partnered with drone company Flytrex on the world’s first fully autonomous drone delivery service in Reykjavik, Iceland. The service can cut delivery time from 25 minutes to four minutes, with a 60% reduction in per-delivery cost.

[The New Yorker] The A.I. “Gaydar” Study and the Real Dangers of Big Data

AI can now infer your sexual orientation from parameters such as hairstyle, hair length, shoes, and polo shirts. As humans, I think we should not be making assumptions about people's sexual orientation based on their appearance. How does that help us in any way? So why are we now asking AI to do it for us?

Kosinski and a colleague, Yilun Wang, had reported the results of a study, to be published in the Journal of Personality and Social Psychology, suggesting that facial-recognition software could correctly identify an individual’s sexuality with uncanny accuracy...When shown two photos, one of a gay man and one of a straight man, Kosinski and Wang’s model could distinguish between them eighty-one per cent of the time.
Researchers have developed a new tool that uses machine learning to identify payment patterns in illicit ads on Backpage, a site commonly used to host online ads for sex work.

The new system follows peculiar or repeated bitcoin transactions that are likely tied to sex trafficking, giving authorities a heads-up about which chains of payments could be signs of crimes. Every transaction on Backpage uses bitcoin.
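The core idea — flagging payment amounts that recur suspiciously often across ads — can be sketched simply. This is a hypothetical illustration, not the researchers' actual system; the sample data, the function name, and the repeat threshold are all assumptions.

```python
from collections import Counter

# Hypothetical sketch: surface bitcoin payment amounts that recur across
# many ads, the kind of repeated pattern the article describes.
# The sample data and min_repeats threshold are illustrative.

def flag_repeated_amounts(payments, min_repeats=3):
    """Return payment amounts (in BTC) appearing at least min_repeats times.

    payments: list of (ad_id, amount) pairs.
    """
    counts = Counter(amount for _, amount in payments)
    return {amt for amt, n in counts.items() if n >= min_repeats}

ads = [("ad1", 0.0042), ("ad2", 0.0042), ("ad3", 0.0100),
       ("ad4", 0.0042), ("ad5", 0.0077)]
suspicious = flag_repeated_amounts(ads)  # {0.0042}
```

The real system reportedly works over chains of blockchain transactions rather than flat amounts, but the same principle applies: repetition in payment behavior becomes a signal worth handing to investigators.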

[Reuters] Sex robots: perverted or practical in fight against sex trafficking?

The argument here is that sex robots are a good option for lonely people and that future brothels will be staffed with robots.

Sex robots can potentially replace prostitutes, reduce sex trafficking and help lonely people.

Experts say the increasingly life-like robots raise complex issues that should be considered by policymakers and the public - including whether use of such devices should be encouraged to curb prostitution and sex trafficking, for sex offenders, or for people with disabilities.
Potpourri: Humorously-runaway AI

Runaway AI is a fear some researchers hold. While it may be technologically premature to worry, we are getting closer. Here are some recent examples of human-automation and human-AI partnerships running amok, with humorous results.

[Sunday Times] Jeremy Clarkson Says He Nearly Crashed While Testing An Autonomous Car (paywalled article); [CarScoops summary]

“I drove a car the other day which has a claim of autonomous capability and twice in the space of 50 miles on the M4 it made a mistake, a huge mistake, which could have resulted in death,” he said. “We have to be very careful legally, so I’m not going to say which one.”
In June, the U.S. Immigrant and Customs Enforcement (ICE) released a letter saying that the agency was searching for someone to design a machine-learning algorithm to automate information gathering about immigrants and determine whether it can be used to prosecute them or deny them entry to the country. The ultimate goal? To enforce President Trump’s executive orders, which have targeted Muslim-majority countries, and to determine whether a person will “contribute to the national interests”—whatever that means.
What I’ve heard is that this is a machine learning problem — that, more or less, for some reason the machine learning algorithm for autocorrect was learning something it never should have learned.
As far as debuts go, there have been more successful ones. During its first hour in service, an automated shuttle in Las Vegas got into an accident, perhaps fittingly the result of a flesh-and-blood human truck driver slowly driving into the unsuspecting robocar, according to a AAA PR representative on Twitter. Nobody was hurt and the truck driver was cited.
Autonomy Potpourri: Evil smart houses, trucker hats, & farming

Upcoming Netflix movie: Evil smart house terrorizes street-smart grifter

I'm sure this movie will give people positive and accurate portrayals of AI, autonomy, and smart-home technology, like Sharknado did for weather phenomena and marine life...

Monroe plays a street-smart grifter who has been kidnapped and held captive as part of a fatal experiment. The only thing standing in the way of her freedom is Tau, an advanced artificial intelligence developed by her captor, played by Skrein. Tau is armed with a battalion of drones that automate a futuristic smart house.

Trucker hat that alerts of sleepiness

I bet the main issue will be a problem of false alarms, leading to disuse.

Being a trucker means driving huge distances on demanding deadlines. And one of the biggest dangers in trucking is the threat of drivers falling asleep at the wheel. To celebrate 60 years of truck production in Brazil, Ford decided to try to help the problem by creating a hat that tracks head movements and alerts drivers in danger of snoozing.
Driverless tractors, combine harvesters and drones have grown a field of crops in Shropshire in a move that could change the face of farming. From sowing the seeds to tending and harvesting the crop, the robot crew farmed a field of barley without humans ever setting foot on the land in a world first. The autonomous vehicles followed a pre-determined path set by GPS to perform each task, while the field was monitored by scientists using self-driving drones.
Robot potpourri: Nannies, teachers, and companions

Today's collection of potpourri items coalesced around how robots are beginning to weave themselves into our lives, from infancy to old age.

[Guardian] 'This is awful': robot can keep children occupied for hours without supervision

Robots do not have the sensitivity or understanding needed for childcare. 
The use of artificial intelligence to aid students' learning dates back to the 1980s, when major technology companies like Lego, Leaf, and Androbot introduced robots to simplify study and related activities.

Since then, robotic technology has gone through various changes to become more advanced and sophisticated. Meanwhile, a new term, educational robots, was coined for these “classroom robots.”

As I reflect back on my own education, I did best in subjects where I admired and respected my teachers.  If this quality is crucial to student-teacher bonding, it suggests that these robotic teachers need to be designed to elicit such emotions.

[CNBC] A wall-crawling robot will soon teach Harvard students how to code

Harvard computer science professor Radhika Nagpal is hoping to use a robotic toy she helped develop, Root, to teach coding languages like Python and Javascript in her undergraduate courses at Harvard.

The Root prototype is already being used in Harvard research labs. And the Root will be widely available this spring.

[Guardian] In an age of robots, schools are teaching our children to be redundant

Interesting story on how being different from a robot is important for surviving in the future workplace, and how today's schools are instead designed to produce a 19th-century factory workforce.

In the future, if you want a job, you must be as unlike a machine as possible: creative, critical and socially skilled. So why are children being taught to behave like machines?

[Intuition robotics] A companion for the elderly

Older adults value their independence and want to remain in the homes they have lived in for years. Home robots can help them live independently, doing a number of things from reminding them to take medications to encouraging exercise. Check out ElliQ, a robot developed by an Israeli startup for older adults.

ELLI•Q™ is an active aging companion that keeps older adults active and engaged. ELLI•Q seamlessly enables older adults to use a vast array of technologies, including video chats, online games and social media to connect with families and friends and overcome the complexity of the digital world.

ELLI•Q inspires participation in activities by proactively suggesting and instantly connecting older adults to digital content such as TED talks, music or audiobooks; recommending activities in the physical world like taking a walk after watching television for a prolonged period of time, keeping appointments and taking medications on time; and connecting with family through technology like chat-bots such as Facebook Messenger.
Autonomy Potpourri from Around the Web

I'm a bit of a news junkie; I start my day by reviewing the 200 or so blogs I follow on a regular basis.  When I come across an item related to autonomy, I'll collect it into what I call a potpourri post.  Some minor commentary follows.