This jellyfish robot can outswim its squishy animal cousin
Jellyfish might not be the fastest swimmers under the sea, but what they lack in speed they more than make up for in style. Their energy-efficient wiggle technique has made them the envy of the underwater world. It’s also caught the eye of scientists from North Carolina State University and Temple University.
The team recently created soft robots inspired by jellyfish that can outswim their real-world counterparts. Their average speed of 53.3 millimeters per second will hardly leave Michael Phelps quaking in his Speedos, but the bots have an impressive combination of power and flexibility.
The researchers had previously modeled their droids on cheetahs. Like their animal cousins, the bots were seriously quick. However, their movements were restricted by their stiff inner spines.
“We wanted to make a completely soft robot, without an inner spine, that still utilized that concept of switching between two stable states in order to make the soft robot move more powerfully – and more quickly,” said study author Jie Yin. “And one of the animals we were inspired by was the jellyfish.”
The team emulated the movement of jellyfish with a plastic polymer disk. One layer of the disk was pre-stressed by stretching it in four directions, while another contained a ring-like air channel. Together, they formed a dome shape similar to a jellyfish.
As air is pumped into the channel, the dome quickly curves down, pushing out water to accelerate the bot forward.
It’s not only jellyfish and cheetahs that have inspired the researchers. They’ve also developed a fast-moving crawler that moves like a larval insect, curling its body to store up some energy, and then quickly releasing it to jump forward.
Another of their recent creations is a three-pronged gripping robot. While most grippers hang open when resting, and then expend energy to grab and move their cargo, this one stays clenched shut when it’s unused.
“The advantage here is that you don’t need energy to hold on to the object during transport – it’s more efficient,” said Yin.
Why Alexa and not Alexander? How gendered voice assistants are hurting as they help
There’s no shortage of articles lately about how the pandemic has set women back. Since women tend to earn less than men, when the time came to take care of children who could no longer go to school or daycare, women ended up with the job. Working women across the globe were forced to quit their careers to become full-time, stay-at-home moms, with all of the caretaking and laundry that comes with it.
For a society that hasn’t quite broken out of its mindset around traditional gender roles, seeing women as everyone else’s helpers instead of their own people with their own destinies is par for the course. We even see this reflected in the emerging field of AI voice assistants – all of which sound female.
“Alexa, why do you sound like a girl?”
Alexa, Siri, Cortana – they’re the latest in a long line of voice assistants that have sounded female. But why?
Well, there are those deeply entrenched societal attitudes around gender roles that we’ve had to work so hard to undo. And then you’ve surely heard about the ongoing gender discrepancy in STEM fields, where only 12% of AI researchers and one in 10 UK IT leaders are female. When more women are at the table and empowered to speak up, they can raise concerns about these types of things.
To be clear, the rise of gendered technology has been a deliberate decision, one that a 2019 UNESCO report dubbed sexist. According to the team behind the Google Assistant, there were technical reasons their 2016 system launched with a feminine voice, despite their initial preference for a male one. Due to biases in their historical text-to-speech (TTS) data, the assistant simply worked better with the female voice than with the male. And with time pressure mounting, the go-to-market product shipped as female only.
But why were their past TTS systems trained on biased data in the first place? And why do we seem to care how our phones speak?
Shrill, passive, whiny…
These three words are commonly used to describe the voices of female speakers. They aren’t exactly flattering! Even sociolinguists spent much of the ’70s labeling passive linguistic features as ‘women’s speech’, which in turn was described as inferior to the powerful, assertive language used by men.
There’s evidence that using a female voice actually improves user experience. A 2019 study by Voicebot found a consumer preference for synthetic female voices over their male counterparts, with an average rating increase of 12.5%; the opposite was true when human voices were rated.
In summary: people prefer a female voice – but only when it is robotic.
“So my voice assistant’s a girl – so what?”
The problem with voice assistants isn’t just that they all sound female. It’s the passive ‘personalities’ that have been designed for them.
Imagine this: you’re a woman walking down the street, minding your own business. Suddenly, a man drives by and yells out the car window, “You’re hot!” This is obviously unacceptable behavior, and reacting to it would probably mean raising a specific finger.
But if you say the same thing to Alexa, you’ll hear “that’s nice of you to say” in response. Hurl gendered insults, such as b*tch or sl*t, and Alexa will politely thank you for the feedback.
Deciding the role of an affable, passive, eager-to-please assistant is one best suited to a woman bolsters the tired stereotype of female subservience. You can order Alexa to remind you to take the garbage out, text your mother for you, and turn off the lights, without so much as a ‘please’. What a well-behaved bot she is!
But how can we learn to treat women better when we can just hurl orders in the general direction of a female-sounding helper?
We’ve seen some progress, but there’s more work to do
It’s not all bad news. Since the UNESCO report was published, Alexa has declared she’s a feminist. The world’s first gender-neutral voice assistant, Q, is being developed to address the issue. And there’s a lot more emphasis on getting women into STEM starting at a very young age, which will pay off in more inclusive technology in the years to come.
But there’s a long way to go – a lot of deep-seated biases that we may not even realize we’re carrying have to be undone. The best place to start is by hiring more women and empowering them to call out when something is blatantly sexist. If we all work together, we can make AI that works for everyone.
Stanford uses AI scans of satellite images to track poverty levels over time
A new AI tool can track poverty levels in African villages over time by scanning satellite images for signs of economic well-being.
The tool searches the images for indicators of development, such as roads, agriculture, housing, and lights turned on at night. Deep learning algorithms find patterns in this data to measure the villages’ wealth.
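The idea behind the tool can be caricatured in a few lines of code. The sketch below is purely illustrative, not the Stanford system: instead of a deep network over many development indicators, it regresses a single hand-picked proxy (average tile brightness, standing in for night-time lights) against a known wealth index, then predicts wealth for new villages. All function names and data here are invented for the example.

```python
def night_light_brightness(image):
    """Mean pixel value of a grayscale tile -- a crude stand-in for the
    night-time lights indicator the real system learns from."""
    return sum(sum(row) for row in image) / (len(image) * len(image[0]))

def fit_wealth_model(images, wealth):
    """Ordinary least-squares line from brightness to a wealth index,
    playing the role the deep network plays in the published tool."""
    xs = [night_light_brightness(im) for im in images]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(wealth) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, wealth))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def predict_wealth(model, image):
    """Estimate a wealth index for an unseen village tile."""
    slope, intercept = model
    return slope * night_light_brightness(image) + intercept
```

In the real system, the training step uses the roughly 20,000 villages with existing survey data, and the learned model is then applied to places and years where no survey exists.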
Researchers from Stanford University tested the tool on about 20,000 villages across 23 countries in Africa that had existing wealth data. They say that it successfully estimated the poverty levels of the villages over time.
Identifying these patterns of growth can show why some places are doing better than others. Those insights could help develop social programs that fit a place’s needs.
Filling gaps in data
The researchers believe their system could measure economic well-being in areas where reliable data is lacking.
“Amazingly, there hasn’t really been any good way to understand how poverty is changing at a local level in Africa. Censuses aren’t frequent enough, and door-to-door surveys rarely return to the same people,” Professor David Lobell told Stanford News.
“If satellites can help us reconstruct a history of poverty, it could open up a lot of room to better understand and alleviate poverty on the continent.”
The team envisions government agencies, NGOs, and businesses using the tool to target services and products to specific types of people. It could also help them work out the effectiveness of anti-poverty programs.
Even if it never makes it into any of their hands, the system could deepen our understanding of what affects economic well-being around the world.