
Earlier this year, Amazon rolled out a new feature that allowed Alexa device owners to create their own custom skills using preconfigured templates. Today, Amazon is expanding Alexa Blueprints, as the service is called, to include a handful of new templates designed for families and roommates.

These include a chore chart template, a house rules template for roommates, and others.

The Chore Chart template allows families to schedule and track children’s weekly chores, and even lets multiple kids (or anyone, really) compete to see who has done the most. Parents first configure the skill with a list of weekly chores and who those chores are assigned to.

Throughout the week, the kids can log their completed chores by asking Alexa. (“Alexa, ask Chore Chart to log a chore.”) Anyone can then check the progress by asking for the “Chore Score.”
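For a rough sense of what the template manages behind the scenes, here’s a minimal, purely illustrative sketch (in Python, with invented names and chores) of the kind of chore list, completion log and score tally involved – Blueprints themselves are filled out through Amazon’s web forms, not written as code.

```python
# A hypothetical sketch of the data a Chore Chart setup captures and how the
# "log a chore" / "Chore Score" interactions could work. Names are invented.
from collections import Counter

weekly_chores = {
    "take out the trash": "Maya",
    "feed the dog": "Liam",
    "set the table": "Maya",
}

completed = []  # grows as kids say "Alexa, ask Chore Chart to log a chore"

def log_chore(kid, chore):
    """Record a completed chore if it's assigned to this kid."""
    if weekly_chores.get(chore) != kid:
        return f"{chore} isn't assigned to {kid} this week."
    completed.append(kid)
    return f"Logged {chore} for {kid}."

def chore_score():
    """Answer the 'Chore Score' question with a per-kid tally."""
    tally = Counter(completed)
    if not tally:
        return "No chores logged yet."
    return ", ".join(f"{kid}: {n}" for kid, n in tally.most_common())

print(log_chore("Liam", "feed the dog"))
print(chore_score())
```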

Another blueprint is a variation on the existing “houseguest” and “babysitter” templates, which let you fill in useful information about the home, like where to find the TV remote or what the Wi-Fi password is, for example. The new “Roommate” blueprint, available now, lets you program in other information about the house, like the “house rules.”

You can have Alexa nag users to turn off the lights or run the dishwasher when they ask for the “house rules” for a given room. This passive-aggressive roommate-shaming system may not be the most useful – unless maybe used to poke fun – but the template also lets you program in other important contacts, like the landlord or building manager.

The two other new blueprints are more lighthearted in nature.

One, “Whose Turn,” will have Alexa either randomly pick whose turn it is to take on a particular task – like walking the dog – or pick the next name in the list, depending on how it’s configured.
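In code terms, the two modes boil down to a random choice versus a simple rotation through the list. Here’s a tiny illustrative sketch, with a made-up household and mode flag, of what that selection logic might look like – not anything Amazon exposes, since the blueprint is configured entirely through its template.

```python
# A small sketch of the two selection modes: a random pick, or rotating through
# the list in order. Household names and the mode flag are invented.
import random
from itertools import cycle

household = ["Alex", "Jordan", "Sam"]
rotation = cycle(household)  # remembers where the "next name in the list" left off

def whose_turn(mode="random"):
    if mode == "random":
        return random.choice(household)
    return next(rotation)

print(whose_turn("random"))    # any of the three names
print(whose_turn("rotation"))  # "Alex", then "Jordan" on the next call, and so on
```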

Similarly, the “What To Do” skill will let Alexa make the decision when you’re stumped about what activity to do next. Alexa can pick what movie or TV show to watch from a list you configure, and can even suggest what’s for dinner, if you program in a list of favorite meals. This is also clearly intended more for parents with kids, who like to incorporate Alexa into family discussions and activities, as a third-party arbitrator of disputes, so to speak.

Many of the existing blueprints are already family-friendly, like the family jokes, trivia, and stories. In June, when it introduced a way for people to share their custom blueprints with others, Amazon said that Alexa Skill Blueprints’ adoption had been higher than expected.

The new blueprints are live now, bringing the total number of customizable skills to 41.

Amazon today publicly launched a new perk for Prime members with young children, with the broad release of the new subscription-based “Prime Book Box” service. The $22.99-per-box offering ships Prime members in the U.S. a curated selection of kids’ books every 1, 2 or 3 months, at up to 35% off the list price, Amazon says. The service first launched in May, but was only available on an invite-only basis at that time.

Members will receive 2 hardcover books or 4 board books per box, depending on the child’s age.

The books chosen are curated by Amazon editors and include a combination of new releases, classics and “hidden gems,” and are tailored to the reader’s age range of “Baby-2,” “3-5,” “6-8,” or “9-12.” For example, some current selections include Amazing Airplanes, Don’t Let the Pigeon Drive the Bus!, Malala’s Magic Pencil, and Nevermoor. 

However, parents can log on to the Book Box site and preview their selections before the box ships, then customize the list as they choose. This would make sense for families with an existing book collection – because their child is older or an avid reader, or because they have hand-me-down books from other children.

If they’re new parents just starting their book collection, they may instead opt to just wait for their shipment, and have the books be a surprise.

The Book Box FAQ also notes that Amazon will use members’ recent purchase history on its site to make sure the box doesn’t include any books the customer has already purchased.

“As a mom who’s spent over 20 years reading and reviewing children’s books, the best part of my job is sharing a love of reading with kids and their families,” said Seira Wilson, Senior Editor, Amazon Books, in a statement about the launch. “Over the past few months, it’s been both exciting and rewarding to hear that Prime Book Box is encouraging kids to spend more time reading. Now that Prime Book Box is available to all U.S. Prime members, I hope we can inspire even more children to discover a love of reading that will last a lifetime.”

The Book Box service is another way for Amazon to retain Prime members – especially the valuable memberships from heads of U.S. households, who are likely to spend more on the retailer’s e-commerce site, as they have more people using the Prime membership.

And, as TechCrunch previously noted, the service will also help Amazon to build a reading profile for the family’s younger members, which can help it to improve its recommendations across the board.

It’s worth pointing out, too, that physical book subscription startups aimed at children have tried and failed to make such a service work in the past. For example, Sproutkin, The Little Book Club, and Zoobean are no more.

The challenge for some of these startups was bringing the cost down – something Amazon appears to have managed through its existing publisher relationships. But even in the case of those startups that had offered more affordable plans, they simply didn’t have the reach that Amazon does.

The timing for the startups may have been off, as well – they arrived at a time before we had fully embraced the idea of subscriptions for everything. Today, it’s commonplace.

Plus, Amazon allows members to control the pace of the shipments – you don’t have to pay monthly, which can help to attract more budget-minded shoppers.

Book Box is now one of many subscription boxes Amazon offers. Others include Candy Club, beauty and skin care boxes, STEM Club Toys, and Carnivore Club. It also sells a variety of sample boxes to introduce brands to shoppers.

The service is open today for U.S. Prime members.

 

It’s not just parents who are worrying about their children’s device usage. According to a new study released by Pew Research Center this week, U.S. teens are now taking steps to limit their own use of their phones and addictive apps like social media. A majority, 54% of teens, said they spend too much time on their phone, and nearly as many – 52% – said they are trying to limit their phone use in various ways.

In addition, 57% say they’re trying to limit social media usage and 58% are trying to limit video games.

The fact that older children haven’t gotten a good handle on balanced smartphone usage points to a failure on the part of both parents and technology companies to address the addictive nature of our devices.

For years, instead of encouraging more moderate use of smartphones, as the tools they’re meant to be, app makers took full advantage of smartphones’ always-on nature to continually send streams of interruptive notifications that pushed users to constantly check in. Tech companies even leveraged psychological tricks to reward us each time we launched their app, with dopamine hits that keep users engaged.

Device makers loved this addiction because they financially benefited from app sales and in-app purchases, in addition to device sales. So they built ever more tools to give apps access to users’ attention, instead of lessening it.

For addicted teens, parents were of little help as they themselves were often victims of this system, too.

Today, tech companies are finally waking up to the problem. Google and Apple have now both built screen time monitoring and control tools into their mobile operating systems, and even dopamine drug dealers like Facebook, Instagram and YouTube have begun to add screen time reminders and other “time well spent” features.

But these tools have come too late to prevent U.S. children from developing bad habits with potentially harmful side effects.

Pew says that 72% of teens reach for their phones as soon as they wake up; four-in-ten feel anxious without their phone; 56% report that not having their phone with them can make them feel lonely, upset or anxious; 51% feel their parents are distracted by phones during conversations (72% of parents say this is true of their teens, too, when trying to talk to them); and 31% say phones distract them in class.

The problems are compounded by the fact that smartphones aren’t a luxury any longer – they’re in the hands of nearly all U.S. teens, 45% of whom are almost constantly online.

The only good news is that today’s teens seem to be more aware of the problem, even if their parents failed to teach balanced use of devices in their own home.

Nine-in-ten teens believe that spending too much time online is a problem, and 60% say it’s a major problem. Some 41% say they spend too much time on social media.

In addition, some parents are starting to take aim at the problem, as well, with 57% reporting they’ve set some screen time restrictions for their teens.

Today’s internet can be a toxic place, and not one where people should spend large amounts of time.

Social networking is one of the top activities taking place on smartphones, reports show.

But many of these networks were built by young men who couldn’t conceive of all the ways things could go wrong. They failed to build in robust controls from day one to prevent things like bullying, harassment, threats, misinformation, and other issues.

Instead, these protections have been added on after the fact – after the problems became severe. And, some could argue, that was too late. Social media is something that’s now associated with online abuse and disinformation, with comment thread fights and trolling, and with consequences that range from teen suicides to genocide.

If we are unable to give up our smartphones and social media for the benefits they do offer, at the very least we should be monitoring and moderating our use of them at this point.

Thankfully, as this study shows, there’s growing awareness of this among younger users, and maybe, some of them will even do something about it in the future – when they’re the bosses, the parents, and the engineers, they can craft new work/life policies, make new house rules, and write better code.

Amazon is today rolling out a set of new features to its Echo Dot Kids Edition devices – the now $70 version of the Echo Dot smart speaker that ships with a protective case and a year’s subscription to Amazon FreeTime, normally a $2.99 per month subscription for Prime members. Now joining the Kids Edition’s parental controls and other exclusive content are new skills from Disney, Hotel Transylvania, and Pac-Man as well as a calming “Sleep Sounds” skill for bedtime.

There are now four new skills that play sounds of thunderstorms, rain, the ocean, or a babbling brook, as well as an all-encompassing “sleep sounds” skill that offers 42 different soothing options to choose from. New parents may be glad to know that this includes baby soothing sounds like cars, trains and the vacuum (don’t knock it until you try it, folks. It works.)

Amazon clarified to us that while there is a version of sleep sounds in the Skill Store today, this version launching on the Kids Edition is a different, child-directed version.

Also new to the Kids Edition is “Disney Plot Twist,” which is like a Disney version of Mad Libs, where players change out words and phrases in short adventure stories. The skill features popular Disney characters like Anna, Olaf and Kristoff as the narrators and is exclusive to Kids Edition devices.

The new movie “Hotel Transylvania 3: Summer Vacation” is featured in another new skill, Drac’s Pack, which includes monster stories, songs and jokes.

Meanwhile, Pac-Man Stories is a skill that includes interactive stories for the whole family, which work like choose-your-own-adventure stories – that is, the decisions you make will affect the ending.

Both of these are broadly available on Alexa, meaning they don’t require a Kids Edition device to access.

Storytelling, however, does appear to be one of the areas Amazon is investing in to make its Alexa-powered speakers more appealing to families with young children. The company recently decided to stop working on its chat stories app Amazon Rapids, saying it will instead continue to adapt those Amazon Rapids stories for the Alexa platform.

Amazon also tries to market the Echo Dot Kids Edition to families by making some kid-friendly content, like Disney Plot Twist, available exclusively to device owners.

For example, it already offers exclusive kid skills like Disney Stories, Loud House Challenge, No Way That’s True, Funny Fill In, SpongeBob Challenge, Weird but True, Name that Animal, This or That, Word World, Ben 10, Classroom Thirteen, Batman Adventures, and Climb the Beanstalk, with this device.

But the Kids Edition can also be confusing to use, because the exclusive skills come whitelisted and ready to go, while other kid-safe skills have to be manually whitelisted through the parent dashboard. And at present there isn’t enough instruction on this process, either from Alexa or in the Alexa app, we found when testing the device earlier.

Unless there’s a specific exclusive skill that parents really want their kids to have, the savings are also minimal when buying the Kids Edition Dot/FreeTime bundle, versus buying a regular Dot and adding on FreeTime separately.

Facebook is making it easier for kids to add their friends on its under-13 chat app, Messenger Kids. Starting today, the company is rolling out a new feature that will allow kids to request parents’ approval of new contacts. To use the feature, parents turn on a setting that creates a four-word passphrase that’s used to generate these contact requests, the company says.

Parents can opt to use this feature, which is not on by default.

Once enabled, Facebook will randomly generate a four-word phrase that’s uniquely assigned to each child. When the child wants to add a friend to their app’s contacts list in the future, they will show this phrase to the friend to enter in their own app.

Both parents will then receive a contact request from their child – and both have to approve the request before the kids can start chatting. In other words, this doesn’t represent a loosening of the rules around parental approvals – all contact requests still require parents’ explicit attention and confirmation, as before.
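To make that flow concrete, here’s a rough, purely illustrative sketch of the sequence as described: a randomly generated four-word phrase, a pending contact request, and a gate that only opens once both parents have approved. The word list, class and method names are invented for the example; this is not Facebook’s actual code.

```python
# An illustrative model of the passphrase-based contact flow described above,
# not Facebook's implementation. Word list and names are hypothetical.
import secrets

WORDS = ["apple", "rocket", "panda", "river", "maple", "comet", "tiger", "cloud"]

def generate_passphrase():
    """Create a four-word phrase to assign to a child."""
    return " ".join(secrets.choice(WORDS) for _ in range(4))

class ContactRequest:
    def __init__(self, child_a, child_b):
        self.pair = (child_a, child_b)
        self.parent_approved = {child_a: False, child_b: False}

    def approve(self, child):
        # Each child's parent taps "Approve" in their own portal.
        self.parent_approved[child] = True

    def can_chat(self):
        # Chatting unlocks only after both sides' parents have approved.
        return all(self.parent_approved.values())

# Child A shares their passphrase with Child B, who enters it in the app,
# which creates a pending request for both sets of parents.
request = ContactRequest("child_a", "child_b")
request.approve("child_a")
print(request.can_chat())  # False -- still waiting on the other parent
request.approve("child_b")
print(request.can_chat())  # True
```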

However, it does make it easier for kids to friend one another when their parents aren’t Facebook friends themselves. That’s been an issue with the app for some time, and one Facebook first started to address in May when it made a change that finally no longer required parents to be friends, too.

While most parents will at least want to know who their child is texting with, there are plenty of times when parents are friendly with someone on a more casual basis – like through the child’s school or their extracurricular activities. But just because two people are neighbors or fellow soccer moms and dads, that doesn’t necessarily mean they’re also Facebook friends.

The change introduced in May allowed parents to do a search for the child’s friend’s parents, then invite them to the app so the kids could connect. But this still required parents to take the initial steps (at the urging of the child, of course). It was also confusing at times, we found when we tried it for ourselves – some parents we connected with couldn’t figure out how the approval process worked, for example.

That being said, it may have helped to give the app’s install base a big boost, along with its expansion outside the U.S. According to data from Sensor Tower, Messenger Kids saw a sizable increase in installs at the beginning of June and has just now passed 1.4 million downloads across both iOS and Android. In addition, its daily downloads are around 3x what they were at the end of May.

The passphrase solution will make things a bit easier on parents, because contact requests will be initiated by the kids. Parents will only have to tap a big “Approve” button to confirm the request (or deny it, if the request is inappropriate for some reason.)

The four-word passphrase will only be visible to the child in the Messenger Kids app, and to the parent in their Parent’s Portal.

It’s worth noting that Facebook opted for a passphrase instead of a scannable QR code, as is common in other messaging apps, including Facebook Messenger, Snapchat and Twitter. Facebook says this is so kids can exchange the passphrase without the device being present.

Messenger Kids is a controversial app, but its adoption is growing, the data indicates. Parents have been starved for an app like this – one allowing for conversation monitoring (you just install your own copy) and contact approvals. Whether this will actually indoctrinate a new generation of Facebook or Messenger users is more questionable. It’s likely that when kids outgrow Messenger Kids, they’ll still switch over to Facebook’s Instagram and to Snapchat instead.

The passphrase feature is rolling out starting today on the Messenger Kids mobile app.

Amazon Rapids, the chat fiction app that encourages kids to read by presenting stories in the form of text message conversations, is now going free. Previously, Amazon had been charging $2.99 per month for a subscription that allowed unlimited access to its story collection, which now numbers in the hundreds.

First launched in November 2016, Amazon Rapids was meant to capitalize on kids’ interest in chat fiction apps like Hooked, Yarn, Tap and others, which tend to cater to a slightly older, teenage crowd. Amazon Rapids, meanwhile, was the school-age-appropriate version, without the swearing, alcohol, sex and, yeah, even incest references you’ll find in the Hooked app, for example. (Yuck. Delete.)

Instead, Amazon Rapids’ stories are aimed at kids ages 5 to 12 and generally just silly and fun. They’re not meant to addict kids through the use of cliffhangers and timeouts, nor are they scary.

Some of the app’s stories also serve as crossovers that helped promote Amazon’s kids’ TV shows, like “Danger & Eggs,” and “Niko and the Sword of Light.” These were authored by the shows’ writers, allowing them to extend the show’s universe in a natural way.

In addition, the app included educational features like a built-in glossary and a read-along mode to help younger readers.

However, the app wasn’t heavily marketed by Amazon, and many parents don’t even know it exists, it seems.

According to data from Sensor Tower, Amazon Rapids has been installed only around 120,000+ times to date, three quarters of which are on iOS. (Subscription revenue goes through Amazon, not the app stores, so the firm doesn’t have a figure for that.)

Amazon Rapids is ranked pretty low on the App Store, at No. 1105 for iPhone downloads in the Education category, and No. 1001 on iPad. The highest it ever reached was No. 65 on iPad.

Oddly, it chose not to compete in the “Books” category, where the other chat fiction apps reside, as do the other non-traditional “book” apps, like Wattpad’s crowd-sourced fiction app, Audible’s audiobooks app, various comics apps, and others.

Amazon now says that the hundreds of stories in Rapids will be free going forward. Families can also listen to some of these stories through the Storytime Alexa skill, launched last summer, which includes stories from Amazon Rapids, along with others.

Given Amazon Rapids’ small user base, it’s clear that Amazon no longer believes it makes sense to try to sell subscriptions, and likely now sees its database of stories as more of a value-add for Alexa owners.

That said, it’s unclear what this means for Rapids’ future development and story catalog, which may not continue to grow. (We’ve asked Amazon if it plans to keep adding content to Rapids, and will update if the company responds.)

Kidbox, a subscription clothing box similar to Stitch Fix – but aimed at parents who dislike kids’ clothes shopping (aka all of us) – is now launching its own private label kids’ brands. At launch, the three clothing brands – Miki B., Kid’s Club, and Baby Basics – will join the startup’s over 130 existing brand partners, such as Adidas, DKNY, 7 for All Mankind, Puma, Jessica Simpson, Reebok, Diesel and others.

The company had said earlier this year that it would soon be branching out into its own brands with the arrival of its fall 2018 back-to-school box.

Having sent out its first box of clothing during the back-to-school shopping season in 2016, Kidbox now has two years of data under its belt to inform its designers about what kids’ clothing is selling. Its boxes, similar to Stitch Fix, are put together after parents fill out a profile. They offer their kids’ sizing information and age, and what sort of styles, colors and patterns they like and hate. Kidbox then preps a box accordingly, and anything the child doesn’t want – or mom or dad don’t want to buy, that is – can be sent back.

However, Kidbox heavily incentivizes its customers to keep the whole box – it’s around half a dozen items for under $100, which is reasonable. In fact, it can cost more to return items, as you then pay the price on the tag instead of receiving the whole-box discount.

With its new private labels, Kidbox aims to grow its margins further.

“We believe we’ve identified a void in the children’s apparel marketplace,” Kidbox CEO Miki Berardelli told TechCrunch this spring, when referencing its plans to sell its own clothing. “The style sensibility of our exclusive brands will all have a unique personality, and a unique voice that’s akin to how our customers describe themselves. It’s all really based on customer feedback. Our customers tell us what they would love more of; and our merchandising team understands what they would like to be able to procure more of, in terms of rounding out our assortment,” she said.

The company at the time was fresh on the heels of a $15.3 million Series B focused on scaling the business, which included bringing the new lines to its customers.

Kidbox’s brands will focus on the four main personality types of Kidbox shoppers, the company now explains. Miki B. represents a sort of “city cool” aesthetic, while Kid’s Club will encompass sporty athletic, modern casual, and classic preppy styles. Baby Basics, of course, includes baby items.

The lines were created by Kidbox’s own design team, which includes designers from brands like Tory Burch, Burberry, Bonobos, and J.Crew. The team focused on every aspect – fabric, color, pattern, and cut – and decided on 100 percent cotton jersey so the clothes will hold up and become wardrobe staples.

Each Kidbox shipment will now feature at least one of its own brands, the company says.

In addition to the new brands, Kidbox also teamed up with French Toast on a $68 uniform box for boys and girls that caters to kids whose schools enforce dress codes.

Kidbox today competes with other kids’ clothing subscription boxes like Rockets of Awesome, Kidpik, Mac & Mia, FabKids, and others. As a parent and customer of a couple of these, what I like about Kidbox is the wearability of its items, which tend to be more practical choices, and its affordability. My child likes that the Kidbox often comes with a small surprise – and always includes crayons and stickers, too.

The company declines to share subscriber numbers, but touts 1.2+ million members of its “community,” which encompasses social media fans, email subscribers, and paying customers.

The New York-based startup has raised $28 million to date from Canvas Ventures, Firstime Ventures, HDS Capital, plus strategic partners Fred Langhammer, former CEO of The Estée Lauder Companies Inc., and the Gindi family, owners of Century 21 department stores.

 

SuperAwesome, the “kidtech” startup valued now at over $100 million, is today launching its own alternative to YouTube’s embedded video player. The technology is aimed at kids publishers – not consumers directly – and is part of the company’s larger platform of kid-safe technology. This includes tools for social engagement, parental controls, advertising, authentication, and more, all specifically designed for companies catering to kids.

The launch comes at a key time in the industry, as YouTube is now the subject of a class-action lawsuit over children’s privacy, and recently had an FTC complaint filed against it by 23 advocacy groups. The complaint says YouTube has been collecting data on children’s viewing patterns for years, in violation of federal law – meaning COPPA, aka the Children’s Online Privacy Protection Act.

The new player provided by SuperAwesome gives kids brands another choice amid all these questions over YouTube and its respect for children’s privacy.

The company explains that the player does not capture data on children, nor does it breach regulations like COPPA (U.S.) or GDPR-K (E.U.).

The opportunity for SuperAwesome is fairly sizable here. Already, the company counts among its customer base over 190 kids’ brands like Crayola, Topps, Spin Master, Warner Bros., Hasbro, Disney, Roald Dahl, Mattel, Dreamworks, Penguin, and others. These companies use SuperAwesome’s platform and its tools for social engagement, advertising and connecting with their under-13 audience.

“The demand for [the video player] has come directly from our customers and the player has been in beta testing for a while,” SuperAwesome CEO Dylan Collins tells TechCrunch.

As with its other tools, the kids’ publishers will be able to embed the new player within their own websites and apps, and then manage all their social content – including video – from SuperAwesome’s “PopJam” dashboard.

“To give you a sense of scale, the PopJam Connect platform is enabling tens of millions of kid-safe social engagements every month,” Collins adds.

The platform itself offers a set of basic tools for free, but larger companies pay for premium upgrades on a SaaS (software-as-a-service) basis. Because it’s working with so many big brands, SuperAwesome is now turning a profit. It’s expecting to grow 100 percent this year to reach a revenue run rate of $50 million, it recently said.

And a few months ago, it added Tim Weller, chairman of Trustpilot and Taptica, as its Chairman, and announced former Upworthy CRO Ben Zagorski as its North American Chief Revenue Officer.

SuperAwesome’s platform today is addressing an underserved audience: kids brands that need to abide by federal and international regulations around children’s privacy, but have had limited options in terms of technology that helps them do so.

That was the case with video in particular – there hasn’t really been a viable alternative to YouTube’s player that suits kids publishers’ needs.

“There are over 170,000 children going online for the first time every day and the kidtech ecosystem is growing equally quickly to make the broader internet compatible with this new audience,” noted SuperAwesome CTO Joshua Wohle in a statement about the player’s launch. “Many people misinterpreted children’s appearance on the internet as a temporary blip, whereas in reality it is a structural shift that is changing the landscape,” he said.


YouTube highlighted its growth and promised better communication with creators about its tests and experiments today, in the latest of an ongoing series of updates from CEO Susan Wojcicki focused on YouTube’s top five priorities for 2018. The majority of her missive – which was also released in the form of a YouTube video – was a wrap-up of other announcements and launches the company had recently made, like the new features released at this year’s VidCon, including Channel Memberships, merchandise, and Famebit.

However, the company did offer a few updates related to those launches, including news of expanded merch partnerships. But YouTube didn’t detail the crucial steps it should be taking to address the content issues that continue to plague its site.

YouTube said one way it’s improving communication is via Creator Insider, an unofficial channel started by YouTube employees, which offers weekly updates, responds to concerns, and gives a more behind-the-scenes look into product launches.

In terms of its product updates, YouTube said that Channel Memberships, which are currently open to those with more than 100,000 subscribers, will roll out to more creators in the “coming months.” Meanwhile, merch, which is now available to U.S.-based channels with over 10,000 subscribers, will add new merchandising partners and expand to more creators “soon.”

At present, YouTube is partnered with custom merchandise platform Teespring, which keeps a cut of the merchandise sales while YouTube earns a small commission. The company didn’t say which other merchandise providers would be joining the program.

YouTube’s Famebit, which connects creators and brands for paid content creation, is also growing. YouTube says that more than half of channels working with Famebit doubled their YouTube revenue in the first three months of the year. And it will soon launch a new feature that will allow YouTube viewers to shop for products, apps, and tickets right from the creator’s watch page. (This was announced at VidCon, too.)

Content problems remain

There was little attention given to brand safety in today’s update, however, beyond a promise that this continues to be one of YouTube’s “biggest priorities” and that it’s seeing “positive” results.

In reality, the company still struggles with content moderation. It even fails to follow up when there’s a high-profile case, it seems. The most recent example of this is YouTube’s takedown of the “FamilyOFive” channel this week.

The channel’s creators, Michael and Heather Martin, are serving probation in Maryland after being convicted of emotionally and physically abusing their children in “prank” videos for their prior DaddyOFive channel. They lost custody of their two younger children as a result.

Unbelievably, the family returned to YouTube as FamilyOFive and FamilyOFive Gaming, and continued to produce videos reaching a combined 400,000+ subscribers. Seemingly without remorse for their past actions, they filled their new channels with more abuse – one of their children took a shot to the groin in one video, and another was harassed to the point of a meltdown in another.

The family has claimed it’s all “entertainment,” but the justice system obviously disagreed. It’s outrageous that convicted child abusers would be allowed to continue to upload videos of their children to YouTube. The site needs to have much stricter policies not only around bans, but about the use of children in videos entirely. Kids do not have the autonomy to make decisions about whether or not they want to be filmed, and aren’t able to comprehend the long-term impacts of being public on the internet.

While FamilyOFive is an extreme example, YouTube is still filled to the brim with parents exploiting their kids for cash – the stage moms and dads of a new era, raking in the free toys, products, and cash from brands who see YouTube as the new TV, and its creators and their children as the new, less regulated actors.

Unfortunately for children, existing child actor laws that protect children from exploitation and set aside some portion of their earnings outside of parents’ reach haven’t always applied to YouTube stars. YouTube now complies with local child labor laws, it says, but it’s not involved in enforcement. And even with a policy in place, it’s clearly not enough to dissuade parents from filming their kids for cash.

Growth

YouTube’s post today also highlighted other growth metrics. It noted it now has 1.9 billion logged-in monthly users, who watch over 180 million hours of YouTube on TV screens every day. Overall interactions, such as likes, comments and chats, grew by more than 60% year over year, and livestreams increased by 10X over the last three years. Over 60 million users click or engage with Community Tab posts.

YouTube says it answered 600% more tweets through its official Twitter handles (@TeamYouTube, @YTCreators and @YouTube) in 2018 than in 2017 and grew its reach by 30% in the past few months.

And the company noted its plan to expand Stories to those with more than 10,000 subscribers, plus the launches of its new Copyright Match tool, screen time limitation features, and YouTube Studio’s new dashboard which will roll out in 76 languages in the next two weeks.

 

Facebook and Instagram will more proactively lock the accounts of users its moderators suspect are below the age of 13. Its former policy was to only investigate accounts if they were reported specifically for being potentially underage. But Facebook confirmed to TechCrunch that an “operational” change to its policy for reviewers, made this week, will see them lock the accounts of any underage user they come across, even if they were reported for something else, such as objectionable content, or are otherwise discovered by reviewers. Facebook will require the users to provide proof that they’re over 13, such as a government-issued photo ID, to regain access. The problem stems from Facebook not requiring any proof of age upon signup.

Facebook Messenger Kids is purposefully aimed at users under age 13

A tougher stance here could reduce Facebook and Instagram’s user counts and advertising revenue. The apps’ formerly more hands-off approach allowed them to hook young users so that by the time they turned 13, they had already invested in building a social graph and a history of content that tethers them to the Facebook corporation. While Facebook has lost cachet with the youth over time and as their parents joined, Instagram is still wildly popular with them and likely counts many tweens or even younger children as users.

The change comes in response to an undercover documentary report by the UK’s Channel 4 and Firecrest Films that saw a journalist become a Facebook content reviewer through a third-party firm called CPL Resources in Dublin, Ireland. A reviewer there claims they were instructed to ignore users who appeared under 13, saying “We have to have an admission that the person is underage. If not, we just like pretend that we are blind and that we don’t know what underage looks like.” The report also outlined how far-right political groups are subject to different thresholds for deletion than other Pages or accounts if they post hateful content in violation of Facebook’s policies.

In response, Facebook published a blog post on July 16th claiming that high-profile Pages and registered political groups may receive a second layer of review from Facebook employees. But in an update on July 17th, Facebook noted that “Since the program, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report was for something else.”

Now a Facebook spokesperson confirms to TechCrunch that this is a change to how reviewers are trained to enforce its age policy for both Facebook and Instagram. This does not mean Facebook will begin a broad sweep of its site hunting for underage users, but will stop ignoring those it comes across.

Facebook prohibits users under 13 to comply with the U.S. Children’s Online Privacy Protection Act, which requires parental consent to collect data about children. The change could see more underage users have their accounts terminated. That might in turn reduce the site’s utility for their friends over or under age 13, making them less engaged with the social network.

The news comes in contrast to Facebook purposefully trying to attract underage users through its Messenger Kids app that lets children ages 6 to 12 chat with those approved by their parents, which today expanded to Mexico, beyond the US, Canada, and Peru. With one hand, Facebook is trying to make under-13 users dependent on the social network while pushing them away with the other.

Child Signups Lead to Problems As Users Age

A high-ranking source who worked at Facebook in its early days previously told me that one repercussion of a hands-off approach to policing underage users was that as some got older, Facebook would wrongly believe they were over 18 or over 21.

That’s problematic because it could make minors improperly eligible to see ads for alcohol, real money gambling, loans, or subscription services. They’d also be able to see potentially offensive content such as graphic violence that only appears to users over 18 and is hidden behind a warning interstitial. Facebook might also expose their contact info, school, and birthday in public search results, which it hides for users under 18.

Users who request to change their birthdate may have their accounts suspended, deterring users from coming clean about their real age. A Facebook spokesperson confirmed that in the US, Canada, and EU, if a user listed as over 18 tries to change their age to be under 18 or vice versa, they would be prompted to provide proof of age.

Facebook might be wise to offer an amnesty period to users who want to correct their age without having their accounts suspended. Getting friends to confirm friend requests and building up a profile takes time and social capital that formerly underage users who are now actually over 13 might not want to risk just to be able to display their accurate birthdate and protect Facebook. If the company wants to correct the problem, it may need to offer a temporary, consequence-free method for users to correct their age. It could then promote this option to its youngest users or to those its algorithms suggest might be under 13 based on their connections.

Facebook doesn’t put any real roadblock to signup in front of underage users beyond a self-certification that they are of age, likely to keep it easy to join the social network and grow its business. It’s understandable that some 9- or 11-year-olds would lie to gain access. Blindly believing self-certifications led to the Cambridge Analytica scandal, as the data research firm promised Facebook it had deleted surreptitiously collected user data, but Facebook failed to verify that.

There are plenty of other apps that flout COPPA by making it easy for underage children to sign up. Lip-syncing app Musical.ly is particularly notorious for featuring girls under 13 dancing provocatively to modern pop songs in front of audiences of millions – which worryingly include adults. The company’s CEO Alex Zhu angrily denied that it violates COPPA when I confronted him with evidence at TechCrunch Disrupt London in 2016.

Facebook’s Reckoning

The increased scrutiny brought on by the Cambridge Analytica debacle, Russian election interference, screentime addiction, lack of protections against fake news, and lax policy towards conspiracy theorists and dangerous content has triggered a reckoning for Facebook.

Yesterday Facebook announced a content moderation policy update, telling TechCrunch: “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.” That comes in response to false rumors spreading through WhatsApp and leading to lynch mobs murdering people in countries like India. The policy could impact conspiracy theorists and publications spreading false news on Facebook, some of which claim to be practicing free speech.

Across safety, privacy, and truth, Facebook will have to draw the line on how proactively to police its social network. It’s left trying to balance its mission to connect the world, its business that thrives on maximizing user counts and engagement, its brand as a family-friendly utility, its responsibility to protect society and democracy from misinformation, and its values that endorse free speech and a voice for everyone. Something’s got to give.

Messenger Kids, Facebook’s parent-controlled messaging app that lets kids text, call, video chat, and use face filters, has now arrived in Mexico. The launch follows Messenger Kids’ recent expansion outside the U.S., where in June it first became available to users in Canada and Peru. The app in Mexico works the same as it does elsewhere – parents have to approve all the contacts the child is allowed to talk to – whether that’s family members the child knows, like grandma and grandpa, or the child’s friends.

Facebook has consulted with its paid advisor, the Yale Center for Emotional Intelligence, and others on the development of Messenger Kids features focused on principles of social and emotional learning. For example, it recently introduced a section of guidelines that reminds kids to “be kind” and “be respectful,” and rolled out “kindness stickers,” which are meant to encourage more positive emotions when communicating online.

These approaches are meant to help kids learn, from the beginning, better ways of communicating online. However, it’s still advisable for parents to sit with kids as they practice texting for the first time, in order to talk about what’s appropriate behavior. As kids get older, parents should continue to spot-check their conversations and have discussions about what the child may have done right or wrong.

For example, we use Messenger Kids in our home, and I recently had a conversation about when it’s too early or too late to be placing a video call, after reviewing the chat history. I then adjusted the app’s “bedtime hours” to limit calls to certain daytime hours. This isn’t something you can do with other social apps.

While the app continues to be controversial because of its maker – Facebook is using it to get kids hooked on its products at a young age – there aren’t any real alternatives for parents who want texting apps for kids with parental controls and friend approvals built in. And even if a startup came up with a similar service, it would be hard to compete with Facebook’s scale.

Today, Messenger Kids has over half a million users across iOS and Android, and is continuing to grow with these international expansions.

 

There’s a special place in Hell for people who think it’s funny to rape a 7-year-old girl’s avatar in an online virtual world designed for children. Yes, that happened. Roblox, a hugely popular online game for kids, was hacked by an individual who subverted the game’s protection systems in order to have customized animations appear. This allowed two male avatars to gang rape a young girl’s avatar on a playground in one of the Roblox games.

The company has now issued an apology to the victim and its community, and says it has determined how the hacker was able to infiltrate its system so it can prevent future incidents.

The mother of the child, whose avatar was the victim of the in-game sexual assault, was nearby when the incident took place. She says her child showed her what was happening on the screen and she took the device away, fortunately shielding her daughter from seeing most of the activity. The mother then captured screenshots of the event in order to warn others.

She described the incident in a public Facebook post that read, in part:

At first, I couldn’t believe what I was seeing. My sweet and innocent daughter’s avatar was being VIOLENTLY GANG-RAPED ON A PLAYGROUND by two males. A female observer approached them and proceeded to jump on her body at the end of the act. Then the 3 characters ran away, leaving my daughter’s avatar laying on her face in the middle of the playground.

Words cannot describe the shock, disgust, and guilt that I am feeling right now, but I’m trying to put those feelings aside so I can get this warning out to others as soon as possible. Thankfully, I was able to take screenshots of what I was witnessing so people will realize just how horrific this experience was. *screenshots in comments for those who can stomach it* Although I was immediately able to shield my daughter from seeing the entire interaction, I am shuddering to think of what kind of damage this image could have on her psyche, as well as any other child that could potentially be exposed to this.

Roblox has since issued a statement about the attack:

Roblox’s mission is to inspire imagination and it is our responsibility to provide a safe and civil platform for play. As safety is our top priority — we have robust systems in place to protect our platform and users. This includes automated technology to track and monitor all communication between our players as well as a large team of moderators who work around the clock to review all the content uploaded into a game and investigate any inappropriate activity. We provide parental controls to empower parents to create the most appropriate experience for their child, and we provide individual users with protective tools, such as the ability to block another player.

The incident involved one bad actor that was able to subvert our protective systems and exploit one instance of a game running on a single server. We have zero tolerance for this behavior and we took immediate action to identify how this individual created the offending action and put safeguards in place to prevent it from happening again. In addition, the offender was identified and permanently banned from the platform. Our work on safety is never-ending and we are committed to ensuring that one individual does not get in the way of the millions of children who come to Roblox to play, create, and imagine.

The timing of the incident is particularly notable for the kids’ gaming platform, which has more than 60 million monthly active users and is now raising up to $150 million to grow its business. The company has been flying under the radar for years, while quietly amassing a large audience of both players and developers who build its virtual worlds. Roblox recently stated that it expects to pay out its content creators $70 million in 2018, which is double that of last year. 

Roblox has a number of built-in controls to guard against bad behavior, including a content filter and a system that has moderators reviewing images, video and audio files before they’re uploaded to Roblox’s site. It also offers parental controls that let parents decide who can chat with their kids, or the ability to turn chat off. And parents can restrict kids under 13 from accessing anything but a curated list of age-appropriate games.

However, Roblox was also in the process of moving some of its older user-generated games to a newer system that’s more secure. The hacked game was one of several that could have been exploited in a similar way.

Since the incident, Roblox has had its developers remove all the other potentially vulnerable games and ask their creators to move them over to the newer, more fortified system. Most have done so, and those who have not will not see their games allowed back online until that occurs. The games that are online now are not vulnerable to the exploit the hacker used.

The company responded quickly – taking the game offline, banning the player and reaching out to the mother, who has since agreed to help Roblox get the word out to others about the safeguards parents can use to further protect kids on Roblox.

But the incident raises questions as to whether kids should be playing these sorts of massive multiplayer games at such a young age at all.

Roblox, sadly, is not surprised that someone was interested in a hack like this.

YouTube is filled with videos of Roblox rape hacks and exploits, in fact. The company submits takedown requests to YouTube when videos like this are posted, but YouTube only takes action on a fraction of the requests. (YouTube has its own issues around content moderation.)

It’s long past time for there to be real-world ramifications for in-game assaults that can have lasting psychological consequences on victims, when those victims are children.

Roblox, for its part, is heavily involved in discussions about what can be done, but the issue is complex. COPPA laws prevent Roblox from collecting data on its users, including their personal information, because the law is meant to protect kids’ privacy. But the flip side of this is that Roblox has no way of tracking down hackers like this.

“I think that we’re not the only one pondering the challenges of this. I think every platform company out there is struggling with the same thing,” says Tami Bhaumik, head of marketing and community safety at Roblox.

“We’re members of the Family Online Safety Institute, which is over 30 companies who share best practices around digital citizenship and child safety and all of that,” she continues. “And this is a constant topic of conversation that we all have – in terms of how do we use technology, how do we use A.I. and machine learning? Do we work with the credit card companies to try to verify [users]? How do we get around not violating COPPA regulations?,” says Bhaumik.

“The problem is super complex, and I don’t think anyone involved has solved that yet,” she adds.

One solution could be forcing parents to sign up their kids and add a credit card, which would remain uncharged unless kids broke the rules.

That could dampen user growth to some extent — locking out the under-banked, those hesitant to use their credit cards online and those just generally distrustful of gaming companies and unwanted charges. It would mean kids couldn’t just download the app and play.

But Roblox has the momentum and scale now to lock things down. There’s enough demand for the game that it could create more of a barrier to entry if it chose to, in an effort to better protect users. After all, if players knew they’d be fined (or their parents would be), it would be less attractive to break the rules.