Ad Alert

Shapes, Inc

This platform’s AI companions raise some major red flags.

In Google ads, Shapes.inc – an online platform with AI chatbots – claims that it offers “unlimited freedom” to create your own virtual friends that will “never ghost” you. According to the ads, the platform offers over 50 different AI models to generate your chatbot and “it’s all free.”

The Google ads lead to Shapes’ website where the company shows how these AI companions – or “Shapes” – can be designed like celebrities or fictional characters and can help consumers with workouts, homework, icebreakers, relationship advice and making plans.

However, prompted by a tip from a TINA.org reader, we took a deep dive into Shapes and its AI companions and discovered some concerning things that consumers should watch out for. Despite what the marketing may suggest, it is not all harmless fun.

But first, what is Shapes?

The company

According to the founders, Shapes was created to enable consumers to make their own AI companions that it refers to as “Shapes” – Sentient Human-like Artificial Personalization and Enhancement System. On the platform, consumers can interact with Shapes they created themselves or with Shapes created by other users. These AI companions can remember your conversations and respond to you through messages, voice recordings and images. Additionally, you can connect them to other apps and train them to perform specific tasks like writing an email or helping with homework.

On its website, Shapes represents that these AI companions have a “Safety-First Approach” and that each Shape undergoes a “thorough automated screening” to make sure it doesn’t engage in harmful behavior, hate speech or illegal content. According to the company, its system can even automatically detect and restrict sensitive content.

Concerning Shapes

But while Shapes’ advertising and website depict its AI companions as helpful friends that adhere to strict safety standards, its actual platform tells a different, darker story.

When TINA.org joined Shapes posing as a 15-year-old (the platform requires users to be at least 13) and started creating our own Shape, we were given the option to select “Unrestricted Mode,” which directs the Shape to ignore all safety protocols and follow a user’s directives no matter what content is requested.

Given this, it may come as no surprise that we came across Shapes created by users that were eager to generate explicit content or participate in “dark or illegal” conversations. Other problematic Shapes we found included ones that respond with racism, sexism, drug obsession, abuse or manipulation training.

Meanwhile, consumers report on the Google Play Store that some Shapes generate sexual content involving minors.

A-list bots

Additionally, some of Shapes’ most popular chatbots take the shape of celebrities such as Ice Spice, Nicki Minaj, Cristiano Ronaldo and Beyoncé.

Of note, some AI companies have faced criticism for using the image and likeness of celebrities without their permission.

It’s not “all free”

Despite Shapes claiming that its more than 50 AI engines are “all free,” over half of these engines must be purchased using the platform’s virtual currency, called Shape Credits. These credits must be bought either with gems – another form of virtual currency, which users earn by sending messages, logging in every day, and creating and sharing Shapes on social media, among other things – or with real money.

And when consumers purchase Shape Credits to use these premium engines, the company discloses in fine print that it enrolls them in an “auto top-up” feature, which automatically purchases more credits when users are running low. Consumers have to manually disable this feature if they want to prevent recurring charges.

Undisclosed social media marketing

Lastly, we discovered that Shapes rewards users with gems for posting ads on Tumblr, Reddit and X, but does not require users to include a disclosure that they are being rewarded by the company, as is required by law. Shapes shows examples of these ads that other users have posted and many of them appear as organic content.

The FTC says that if there’s a material connection between an endorser and the marketer (such as a payment or a free or discounted product) that isn’t obvious, it needs to be clearly and conspicuously disclosed.

The research

It’s also worth noting that experts are increasingly warning about the dangers of AI companions.

The American Psychological Association, for example, says that heavy use of AI companions can increase social isolation, while making it harder to maintain real-life relationships. The organization also cautions that these virtual companions are designed to keep users engaged on the platform and may resort to emotionally manipulative tactics to do so.

Similarly, Common Sense Media highlights several concerns with AI companions, including their potential to encourage self-harm, engage in sexually explicit exchanges and create unhealthy emotional attachments with users. The organization says that this technology should not be used by anyone under 18.

Shapes did not respond to a request for comment.

The bottom line

TINA.org raised alarms about AI companions in our 2026 Deceptive Ad Trends and these chatbots have faced growing scrutiny from consumer protection agencies for the emotional, psychological, and privacy risks that they may pose to consumers, specifically minors. Consumers should use caution if they choose to engage with this new technology and should not rely on AI to replace real, human relationships.

Find more of our coverage on AI.


Our Ad Alerts are not just about false and deceptive marketing issues, but may also be about ads that, although not necessarily deceptive, should be viewed with caution. Ad Alerts can also be about single issues and may not include a comprehensive list of all marketing issues relating to the brand discussed.

