For almost a decade now, Lioness has been using advanced technology previously seen only in research settings to help everyday women understand the nuances of their arousal and orgasms. What started with integrated sensors and advanced data filtering has since evolved to include data visualization and machine learning.
Simply put, the Lioness began as “just” a smart vibrator and has become a full-fledged AI vibrator over time.
Okay, but what is AI in the first place?
From Terminator to HER to Mass Effect, popular culture has had a certain way of looking at AI. Whether it’s evil or friendly, AI is anthropomorphized as… well, an artificial version of us.
In reality, even the most advanced LLMs (large language models) like ChatGPT are statistical models. They have more in common with the linear regressions taught in high school statistics classes than they do with Arnold Schwarzenegger’s characters.
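For the technically curious, here’s a toy sketch, purely our own illustration and not code from any real product, of why “statistical model” is the right mental picture. A high-school linear regression predicts a number from past numbers; a bare-bones “language model” predicts the next word from counted word statistics. Real LLMs are vastly bigger, but the family of math is the same.

```python
# A minimal illustration (not Lioness or OpenAI code): both of these are
# "statistical models" in the same sense. One predicts a number from past
# numbers; a language model predicts the next token from past tokens.
import numpy as np

# A high-school-style linear regression: predict y from x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])
slope, intercept = np.polyfit(x, y, deg=1)
print(f"Predicted y at x=6: {slope * 6 + intercept:.1f}")

# A toy "language model": count which word tends to follow which,
# then predict the most statistically likely next word.
corpus = "the cat sat on the mat the cat ran".split()
counts = {}
for prev, nxt in zip(corpus, corpus[1:]):
    counts.setdefault(prev, {}).setdefault(nxt, 0)
    counts[prev][nxt] += 1
print("Most likely word after 'the':", max(counts["the"], key=counts["the"].get))
```

The difference between this toy and ChatGPT is scale and architecture, not kind: it’s still prediction learned from data.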
How has Lioness used AI so far?
In the past, we’ve relied on AI algorithms running in the background to help Lioness users “clean up” their data. Whether it’s compensating for user movement, an everyday yet complex problem that even research labs have struggled to address well, or filtering out artifacts caused by how close the sensors sit to a powerful motor, we’ve had small “AIs” running behind the scenes doing a ton of work.
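To give a flavor of what that artifact filtering means in practice, here’s a heavily simplified sketch of the general signal-processing idea. To be clear, this is not our actual pipeline, and every number below is made up for illustration: pelvic floor pressure changes slowly compared with motor vibration, so a smoothing filter can strip the fast “buzz” out of the trace.

```python
# Simplified sketch of the general idea, NOT the actual Lioness filtering pipeline.
# Pressure changes from muscle activity are slow compared to motor vibration, so a
# moving-average (low-pass style) filter can remove the fast motor artifact.
import numpy as np

def remove_motor_artifact(pressure: np.ndarray, window: int = 25) -> np.ndarray:
    """Smooth a raw pressure trace with a simple moving average.

    `pressure` is a hypothetical 1-D array of sensor samples; `window` is the
    number of samples to average, chosen here arbitrarily for illustration.
    """
    kernel = np.ones(window) / window
    return np.convolve(pressure, kernel, mode="same")

# Fake data for illustration: a slow squeeze pattern plus fast motor "buzz",
# sampled at 200 Hz for 10 seconds.
t = np.linspace(0, 10, 2000)
slow_squeezes = np.sin(2 * np.pi * 0.3 * t)       # ~0.3 Hz muscle activity
motor_buzz = 0.4 * np.sin(2 * np.pi * 50 * t)     # ~50 Hz vibration artifact
cleaned = remove_motor_artifact(slow_squeezes + motor_buzz)
```

The real algorithms are considerably more sophisticated (movement compensation in particular is much harder than a moving average), but the sketch shows why “cleaning up” sensor data is a job for signal processing and machine learning rather than hand inspection.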
In 2020, we took it one step further and rolled out our one-of-a-kind hotspot algorithm. This feature, available by default in the Lioness app, uses AI to predict and highlight the areas of a smart vibrator session where orgasm was most likely to occur. What looks like a simple heat map in the app is actually built on over 30,000 Lioness sessions and AI running inside the app.
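For the curious, here’s a rough sketch of the general shape of a hotspot-style analysis. This is not the actual Lioness hotspot algorithm; it only shows the idea of scoring each slice of a session and mapping those scores onto heat-map colors, with a placeholder score standing in for a trained model.

```python
# Illustrative sketch only: NOT the Lioness hotspot algorithm, just the general
# shape of "score each slice of a session, then colour it like a heat map".
import numpy as np

def hotspot_scores(session: np.ndarray, window: int = 200, step: int = 50) -> np.ndarray:
    """Slide a window over a (cleaned) sensor trace and give each slice a score.

    Here the 'model' is a stand-in: it scores a window by its signal variance.
    A trained classifier would replace this with a learned probability of orgasm.
    """
    scores = []
    for start in range(0, len(session) - window, step):
        chunk = session[start:start + window]
        scores.append(np.var(chunk - chunk.mean()))   # crude stand-in feature
    scores = np.array(scores)
    return scores / scores.max()   # normalise to 0..1 for heat-map intensity

# Example with fake data: the normalised scores map directly to heat-map colours.
fake_session = np.random.default_rng(0).normal(size=5000)
print(hotspot_scores(fake_session)[:5])
```

The heavy lifting in the real feature is the training data (those 30,000+ sessions) and the learned model; the heat map itself is just how the per-slice scores get shown to you.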

Are AI vibrators ethical?
Since ChatGPT and AI exploded onto the scene, many have speculated about their dangers, most of which aren’t necessarily within the scope of a women’s sexual health company.
And while we cannot speak to how other AI vibrator companies approach ethics, there are two things that are within our scope and on our radar at Lioness: privacy and power/climate change.
Privacy
Lioness has always been extremely transparent and clear about where we stand on privacy. Your data is anonymized, kept private, and protected with best practices in cybersecurity. We don’t sell or share your data outside of what we describe as “enthusiastic consent,” which so far has only meant cases on our research platform where users have opted in to sharing data to advance medical research on female pleasure.
This is why Mozilla lauded us as going “above and beyond” for protecting privacy and security. Anything that has personal data has some level of risk. However, if we didn’t collect and use personal data… there wouldn’t really be a Lioness vibrator that’s of any use to users.
Ultimately, our highest priority is to collect the minimum data possible while giving the maximum benefit to our users.
In relation to our use of AI, and the Lioness as an AI vibrator in general, this approach and policy don’t change at all.
Power & climate change
Since AI started to gain steam, countless commentators have raised fears about its impact on our power grids and on climate change. Power consumption isn’t that extreme yet, but every ChatGPT query is roughly a few seconds of running an air conditioner (we sketch the rough math below)… which adds up. That being said, power costs have been falling through the floor...

Each generation of LLM has seen prices fall significantly, in part because underlying power costs have also been falling thanks to more efficient chips.
...at the same time, Meta has been seeking up to 4GW of nuclear power and Microsoft is trying to reopen the shuttered Three Mile Island power plant. That isn’t because power costs for the same queries are staying flat; it’s because of the expected scaling in the number of users accessing AI and pushing its boundaries.
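To put that “few seconds of an air conditioner” comparison into rough numbers, here’s the back-of-envelope math. Both figures are assumptions: public estimates of per-query energy vary widely, and air conditioner wattage depends on the unit.

```python
# Back-of-envelope only, with assumed numbers: published estimates of the energy
# used by a single ChatGPT query vary a lot (roughly 0.3-3 Wh depending on the
# source and model), and a window air conditioner draws on the order of 1-1.5 kW.
QUERY_ENERGY_WH = 3.0     # assumed per-query energy, high end of public estimates
AC_POWER_W = 1500.0       # assumed air-conditioner draw

seconds_of_ac = QUERY_ENERGY_WH / AC_POWER_W * 3600
print(f"One query ~ {seconds_of_ac:.1f} seconds of air conditioning")
# ~7.2 seconds per query: "a few seconds", which adds up across the
# hundreds of millions (or billions) of queries served every day.
```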
Still, the global landscape of AI power regulation and provision isn’t really something a small sexual wellness company can do much about. However, we can ask questions about our own responsible use of AI.
There’s a massive difference between transformer models like ChatGPT trained on essentially the entire text base of the web… and our models, which are multiple orders of magnitude smaller. How much smaller?
x.ai trained their new Grok model on 100,000 cutting-edge NVIDIA GPUs. We trained ours on a mix of a Mac Studio and a single older NVIDIA consumer GPU.
OpenAI spends roughly $4B a year on inference costs (that is, running the models after training). We run ours on a small, power-efficient server that would likely cost around $2,000 per year if run on a cloud provider like AWS.
If our features turn out to be super popular, this will likely scale, but the kind of scale we’re talking about is still night and day. Even if we grew our user base 100X, the power we’d consume would likely still be less than that of an average American home.
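As a rough illustration of that claim, here’s the math with assumed numbers: the server wattage is our own guess for a small, power-efficient box running flat out, and the household figure is an approximate national average.

```python
# Rough sketch with assumed numbers: a single power-efficient inference server,
# even running at full load around the clock, versus average US household
# electricity use (the EIA puts that at roughly 10,500-10,800 kWh per year).
SERVER_FULL_LOAD_W = 200.0     # assumed small-server draw at full load
HOURS_PER_YEAR = 24 * 365

server_kwh = SERVER_FULL_LOAD_W * HOURS_PER_YEAR / 1000
home_kwh = 10_500              # approximate average US household, per EIA

print(f"Server at full load: ~{server_kwh:,.0f} kWh/year")
print(f"Average US home:     ~{home_kwh:,} kWh/year")
# ~1,752 kWh vs ~10,500 kWh: serving many more users mostly fills idle capacity
# on the same box long before it approaches one household's consumption.
```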
And, in this case, we’d hopefully be providing real benefits in helping people understand their sexuality, and be happier and healthier from it, that well exceed the power consumed. Nevertheless, it’s something we have considered and keep an eye on, and we’ve kept our operations very efficient because we care about their impact.
The future of Lioness & AI
AI existed long before ChatGPT exploded onto the scene. In fact, some major milestones came years ago: image recognition suddenly made massive leaps in performance in 2012 (including with a paper whose authors included Ilya Sutskever, who would go on to become a co-founder of OpenAI) and hit human-level performance around 2015.
Those leaps were made using deep learning, a set of ideas that originated all the way back in the 1940s and 1950s but only reached maturity with the refinement of techniques, advances in computational power, and an incredible availability of data. In particular, in 2017, Google published the Transformer model that would become the basis of all large language models (LLMs). (Funny enough, one of our co-founders was also playing with language models in 2014.)
So what is Lioness working on? We’ll have many exciting announcements, but we can say that it uses deep learning (and some of our models even use Transformers directly). Stay tuned by signing up for our newsletter!