Minnesota House is looking at taxing social media companies for collecting customer data

The Minnesota House of Representatives reports

When looking at the business model for companies specializing in social media, it might be helpful to steer away from the idea that you’re a “customer” in any conventional sense. Actually, you are more the product.

Such companies make most of their money not by serving you, but rather by serving you up to others, selling what information they can collect on you.

Should the state get a piece of such transactions? Rep. Aisha Gomez (DFL-Mpls) thinks so. That’s why she’s sponsoring HF3117, which would create an excise tax on some businesses that operate social media platforms. The idea is that such companies would have to pay a tax to the state based on how many Minnesotans use their social media sites.

On Wednesday, the House Taxes Committee laid the bill over, as amended, for possible omnibus bill inclusion.

What would that look like…

The excise tax would be based on the number of Minnesota resident consumers on whom a company collects consumer data. The tax would be imposed at graduated rates between 10 and 50 cents per consumer per month, based on how many state users the company collects data on in a given month.

The Revenue Department estimates that the bill’s tax changes would increase the state’s General Fund by $45.5 million in fiscal year 2026, climbing to $92.7 million in fiscal year 2027. It also estimates that, for fiscal year 2026, 14 social media platforms would be subject to the tax.

New bill SF3056 would tax social media companies for collecting data on MN consumers

Law 360 reports...

Minnesota would impose a tax on consumer data collection done by social media platforms based on the number of Minnesota consumers, if the platform has more than 100,000 consumers, under a bill introduced in the state Senate.
S.F. 3065, which Sen. Ann Rest, DFL-Hennepin County, introduced Thursday, would impose a tax on social media companies that collect data from Minnesota consumers. The monthly tax would be 10 cents per Minnesota consumer on more than 100,000 consumers and up to 500,000 consumers; 25 cents per Minnesota consumer on more than 500,000 consumers and up to 1 million consumers; and 50 cents per consumer on more than 1 million consumers.
The bill would define a consumer as someone with an account on the social media application or website.
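The graduated rates read like marginal tax brackets, with the first 100,000 Minnesota consumers untaxed and each rate applying only to the consumers within its band. Here is a minimal sketch of that reading in Python; this is an illustration of one plausible interpretation, not the statute's official computation, and amounts are kept in integer cents to avoid floating-point rounding:

```python
def monthly_data_tax_cents(consumers: int) -> int:
    """Monthly tax in cents under a marginal-bracket reading of S.F. 3065.

    consumers: number of Minnesota consumers the platform collects data on.
    The first 100,000 consumers are untaxed; each rate applies only to the
    consumers falling within its band.
    """
    brackets = [
        (100_000, 500_000, 10),    # 10 cents per consumer, >100k up to 500k
        (500_000, 1_000_000, 25),  # 25 cents per consumer, >500k up to 1M
        (1_000_000, None, 50),     # 50 cents per consumer, over 1M
    ]
    tax = 0
    for lower, upper, rate_cents in brackets:
        if consumers > lower:
            in_band = (consumers if upper is None else min(consumers, upper)) - lower
            tax += in_band * rate_cents
    return tax

# A hypothetical platform with 2 million Minnesota consumers:
# 400,000 * 10c + 500,000 * 25c + 1,000,000 * 50c = $665,000 per month
print(f"${monthly_data_tax_cents(2_000_000) / 100:,.2f}")
```

Under this reading, a platform with 100,000 or fewer Minnesota consumers owes nothing, which matches the bill's threshold; if the rates instead applied to the entire consumer count once a threshold is crossed, the amounts would be larger.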

MN HF1289: bill requiring social media mental health warning labels to be discussed March 20, 2025

The MN House reports…

HF1289 (Stephenson); Social media platforms required to post a mental health warning label and timer notifications.

To be discussed next week…

Commerce Committee

Co-Chairs: Rep. Kaohly Her and Rep. Tim O’Driscoll

Thursday, March 20, 2025 – 8:15AM

Room 120, Capitol Building

 

Co-Chair Her holds the gavel

 

AGENDA

Call to Order

Approval of the Minutes – March 19, 2025

 

  • HF1392 (Lee, K.); Consumer protection restitution account established, report required, and money appropriated
  • HF2233 (Niska); Uniform Special Deposits Act adopted.
  • HF695 (Finke); Places of entertainment required to provide access to potable water at events.
  • HF2228 (Elkins); Task force on homeowners and commercial property insurance established, report required, and money appropriated.
  • HF1289 (Stephenson); Social media platforms required to post a mental health warning label and timer notifications.
  • Adjournment

Bills may be added, removed, or taken up in any order at the discretion of the Co-Chairs

If you would like to testify or provide handouts to the committee, please email simon.brown@house.mn.gov and jonathan.cotter@house.mn.gov no later than March 19 at 1:00pm.

Please note that this listserv notification email is not monitored. If you wish to reply, please email me directly at: simon.brown@house.mn.gov

EVENT Mar 6: MN House has discussion and presentation on AI, Deepfake Technology, and Social Media Algorithms

I think this will be interesting both for learning about new media and AI and for learning what the Legislature knows about technology…

Commerce Finance and Policy
Chair: Rep. Tim O’Driscoll
Location: Capitol 120
Agenda:

I. Call to Order
II. Approval of the Minutes
III. Discussion and Presentation on Social Districts
IV. Discussion and Presentation on AI, Deepfake Technology, and Social Media Algorithms
V. Adjournment

Attachments:

Schedule.

A look at social media use and tools by teens

Pew Research Center reports

Amid national concerns about technology’s impact on youth, many teens are as digitally connected as ever. Most teens use social media and have a smartphone, and nearly half say they’re online almost constantly, according to a new Pew Research Center survey of U.S. teens ages 13 to 17 conducted Sept. 18-Oct. 10, 2024.

[Line chart: YouTube, TikTok, Instagram and Snapchat top the list for teens]

YouTube tops the list of the online platforms we asked about in our survey. Nine-in-ten teens report using the site, slightly down from 95% in 2022.

TikTok, Instagram and Snapchat remain widely used among teens. Roughly six-in-ten teens say they use TikTok and Instagram, and 55% say the same for Snapchat.

Facebook and X use have steeply declined over the past decade. Today, 32% of teens say they use Facebook. This is down from 71% in 2014-15, though the share of teens who use the site has remained stable in recent years. And 17% of teens say they use X (formerly Twitter) – about half the share who said this a decade ago (33%), and down from 23% in 2022.

Roughly one-quarter of teens (23%) say they use WhatsApp, up 6 percentage points since 2022.

And 14% of teens use Reddit, a share that has remained stable over the past few years.

We asked about Threads, launched by parent company Meta in 2023, for the first time this year. Only 6% of teens report using it.

MN schools and teachers using online translation services to communicate with families

Gov Tech reports

Some Minnesota educators have signed onto apps and platforms that use machine-learning algorithms to help translate websites, newsletters and even texts to parents into multiple languages.

The article reports on some high points and low points. It is a way to get information into many languages, but without some quality control, you can’t be sure that correct and full information is being shared. There are some plans for improvement…

After winter break, the district plans to roll out an app that will connect teachers to a live interpreter to help interpret conferences, or parent meetings, for example.

“That’s going to be such a resource for us,” said Danilo McCarthy, an English language support specialist for South Washington County schools. He also holds workshops for families on using those new technologies to best communicate with their child’s teacher.

Kourajian, the Mounds View middle school teacher, offered similar training for her fellow teachers on the translation app — called TalkingPoints — that allows her to translate quick messages to and from parents. The district first offered TalkingPoints to its staff in the 2022-23 school year, but it wasn’t immediately embraced by all, Kourajian said.

Usage has jumped this year and more than 23,000 messages have gone to families, most commonly in Spanish, Somali and Arabic. Still, only about 40 percent of staff are using the app.

Many years ago, I taught English in Catalonia, Spain. A lack of translation tools meant teachers, students and parents all learned quickly to communicate when necessary, but some help would have been a gift. The old-school version of AI was the boss writing down a dozen phrases in Catalan that I could use on report cards.

From St Paul to Chicago in a driverless car: firsthand observations through a rural broadband lens

Driverless cars are here. I know because my brother recently bought a 2025 Model Y and we “drove” it from St Paul to Chicago. So many people asked me about the drive, so I thought I’d share here.

The car is driverless, but not autonomous, which means he has to pay attention the whole time, but he doesn’t have to touch the steering wheel or accelerator – unless he wants to. I’ll tell you the coolest part first – then the mundane details. You can get the car to pick you up with no one in the car. It’s a short distance (maybe 250 feet), but you could park at the grocery store and have the car pick you up at the door. Just push a button. (Videos below.)

To have the car “take the wheel” you type in an address and go. You can set parameters such as top speed or choose a setting: chill, aggressive or Mad Max. (My brother may have made up those terms, but you get it.) At any time, you can take over the steering wheel or accelerate/decelerate. In practice you don’t have to grab the wheel; just touch the indicator and it will change lanes or move over.

Highway driving seems very easy. Even off and on ramps are smooth. My brother took control once when a semi turned right into our lane to avoid a slow car. Residential driving is slow. I live on a St Paul street with on street parking and lots of family homes. Cars parked on either side of the narrow street make the car much slower. But the odds of a toddler chasing a ball are higher here than other areas. Driving on rural county roads was trickiest.

 

I could compare the driverless car to me driving down a winding county road – except I might put on the brakes for turns. The car doesn’t appear to see the yellow warning signs, such as suggested speeds for curves and turns. Also, it doesn’t read the yellow dividing lines well if one side is solid and one is broken; it will only read the side it’s on. But the car gets smarter every day with AI. So I imagine this would get smoothed out the more you drove a country road; also, the cars collectively get smarter, so the more you drove the road, the smarter all Teslas would get.

Bringing it around to broadband, the question I had was the impact of slower or spotty broadband. The car will still be “drivable” but won’t know where to go because broadband facilitates the following:

  • The car collects data to report back to the collective Tesla brain (AI); without broadband, data will not be uploaded.
  • The car downloads regular updates with broadband; without broadband updates don’t happen.
  • The car uses Google Maps for directions; without broadband it won’t be able to navigate to an address. The car won’t stop working, though, and the 8 cameras around the car help the car not go through a red light or hit anything.

Looking at the economic impact of electric cars in rural areas, I noticed a few things:

  • Again, the car reads Google Maps. Make sure your business is on online maps if you want to be seen. (I couldn’t actually see businesses along the way, but that can’t be far behind, and might even just be a setting we didn’t use.)
  • Electric cars need charging stations. We stopped twice. It’s about 20 minutes for a full charge. Proximity to the charging station was the only criterion for where we had lunch and could have had snacks.
  • And based on the observation above – recognize that yellow signs are not seen.

If you get a chance to ride in a driverless car – take it. It’s fun. If/when it snows and slushes, I’ll try again and report back. I’m curious to see how the cameras handle that!

Senators Klobuchar and Warner ask tech companies to address election-related misinformation

Senator Klobuchar reports

U.S. Senator Amy Klobuchar (D-MN), Chairwoman of the Senate Committee on Rules and Administration with oversight over federal elections, along with Senator Mark Warner (D-VA), Chairman of the Senate Select Committee on Intelligence, sent a letter to the CEO of Meta, Mark Zuckerberg; CEO of X, Linda Yaccarino; CEO of Alphabet, Sundar Pichai; CEO of Twitch, Daniel Clancy; and CEO of Discord, Jason Citron to highlight the risks of election-related disinformation on their platforms, and to urge the companies to take decisive action, including bolstering content moderation resources, to combat deceptive content intended to mislead voters or sow violence.

“Your companies are on the frontlines of the risks to our democracy posed by online disinformation and technology-enabled election influence, and it is for these reasons that we urge you to prioritize taking action to ensure that you have the policies, procedures, and staff in place to counter and respond promptly to these threats,” said the lawmakers.

As Chair of the Rules Committee, Senator Klobuchar has worked on a bipartisan basis to safeguard our elections and strengthen democracy.

In May 2024, the Senate Rules Committee passed three Klobuchar-led bipartisan bills to address the impact of AI on our elections. The bills include the Protect Elections from Deceptive AI Act with Senators Josh Hawley (R-MO), Chris Coons (D-DE), and Susan Collins (R-ME) joined by Senator Michael Bennet (D-CO), the AI Transparency in Elections Act with Senator Lisa Murkowski (R-AK), and the Preparing Election Administrators for AI Act with Senator Collins.

In January 2024, Klobuchar and Collins called on the Election Assistance Commission (EAC) to assist state and local election officials in combating the spread of AI-generated disinformation about our elections. Their letter followed the reports of AI-generated deepfake robocalls using President Biden’s voice to discourage voting in the New Hampshire primary election. In February, the EAC voted unanimously to allow election officials to use federal election funds to counter disinformation in our elections caused by AI.

Klobuchar is a lead sponsor of the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act of 2023. The NO FAKES Act is a bipartisan proposal that would protect the voice and visual likeness of all individuals from unauthorized recreations from generative AI.

The full text of the letter is available HERE

The UN provides recommendations for Governing AI for Humanity

The United Nations has released a report on Governing AI for Humanity. It’s daunting because the topic is daunting, but the report is easy to read. They start by recognizing that one of the tasks is getting everyone to the table. AI has the power to make our jobs easier and harder, to create and eliminate jobs, to bring us together and push us apart. It’s fascinating, and it can feel like science fiction, but it’s no longer on the doorstep; it’s in our living rooms. What we can do to prepare our communities is to learn as much as we can and start conversations with diverse communities. Holding classes on understanding or implementing AI might be just as popular in senior living spaces as in the schools.

 

I’ve pulled out the UN’s recommendations to get folks thinking about AI and perhaps thinking about what needs to happen at the local level to help your community benefit from the powers of AI…

Recommendation 1 An international scientific panel on AI
We recommend the creation of an independent international scientific panel on AI, made up of diverse multidisciplinary experts in the field serving in their personal capacity on a voluntary basis.

Recommendation 2 Policy dialogue on AI governance
We recommend the launch of a twice-yearly intergovernmental and multi-stakeholder policy dialogue on AI governance on the margins of existing meetings at the United Nations.

Recommendation 3 AI standards exchange
We recommend the creation of an AI standards exchange, bringing together representatives from national and international standard-development organizations, technology companies, civil society and representatives from the international scientific panel.

Recommendation 4 Capacity development network
We recommend the creation of an AI capacity development network to link up a set of collaborating, United Nations-affiliated capacity development centres making available expertise, compute and AI training data to key actors.

Recommendation 5 Global fund for AI
We recommend the creation of a global fund for AI to put a floor under the AI divide.

Recommendation 6 Global AI data framework
We recommend the creation of a global AI data framework, developed through a process initiated by a relevant agency such as the United Nations Commission on International Trade Law and informed by the work of other international organizations.

Recommendation 7 AI office within the Secretariat
We recommend the creation of an AI office within the Secretariat, reporting to the Secretary-General. It should be light and agile in organization, drawing, wherever possible, on relevant existing United Nations entities, acting as the “glue” that supports and catalyzes the proposals in this report, partnering and interfacing with other processes and institutions.

I also chose a graphic that piqued my interest…

Senator Klobuchar and others urge Justice Department and FTC to investigate generative AI potential antitrust violations

Senator Klobuchar and others ask the powers that be to look into generative AI…

U.S. Senator Amy Klobuchar (D-MN), Chairwoman of the Judiciary Subcommittee on Antitrust, Competition Policy and Consumer Rights, along with Senators Richard Blumenthal (D-CT), Mazie Hirono (D-HI), Dick Durbin (D-IL), Sheldon Whitehouse (D-RI), Tammy Duckworth (D-IL), Elizabeth Warren (D-MA), and Tina Smith (D-MN) sent a letter to Assistant Attorney General Jonathan Kanter and Federal Trade Commission (FTC) Chair Lina Khan to highlight the risks that new generative artificial intelligence (AI) features pose to competition and innovation in digital content, including journalism, and to urge both agencies to investigate whether the design of these features violates the antitrust laws.

“Recently, multiple dominant online platforms have introduced new generative AI features that answer user queries by summarizing, or, in some cases, merely regurgitating online content from other sources or platforms. The introduction of these new generative AI features further threatens the ability of journalists and other content creators to earn compensation for their vital work. While a traditional search result or news feed links may lead users to the publisher’s website, an AI-generated summary keeps the users on the original search platform, where that platform alone can profit from the user’s attention through advertising and data collection,” wrote the lawmakers. “Moreover, some generative AI features misappropriate third-party content and pass it off as novel content generated by the platform’s AI.”

“For the reasons outlined above, we urge the Department of Justice Antitrust Division and the Federal Trade Commission to investigate whether the design of some generative AI features, introduced by already dominant platforms, are a form of exclusionary conduct or an unfair method of competition in violation of the antitrust laws,” concluded the lawmakers.

Congresswoman Angie Craig working on bill to make selling illicit drugs on social media illegal

MinnPost tells the story of how one bereaved mom got Congresswoman Angie Craig interested and involved with getting rid of illicit drug sales on social media platforms…

Devin was Bridgette Norring’s middle child and oldest son. Beyond his interest in music, she said, “He was very athletic. He was into football, BMX, skateboarding. He was really good at math. He could rattle off the answers to the most complicated equations.” The Hastings teen also had a light-hearted side, she added: “He had a really quick-witted sense of humor. He was always trying to make someone laugh.”

On April 3, 2020, struggling with debilitating pain from blackout migraines and a cracked molar, Devin, desperate for relief after his doctor and dentist appointments had been canceled due to the pandemic, purchased a pill he thought was the pain reliever Percocet from a pair of local dealers who sold drugs on the social media platform Snapchat. It turned out that the pressed pill Devin took was not Percocet. It was actually 100% Fentanyl. He overdosed and was found dead in his bedroom the next morning. He was 19.

Norring connected with Craig…

Eventually, Craig invited Norring to join her Mental Health and Substance Use Disorder Advisory Council. “Of course I accepted,” Norring said. “It is a wonderful way to get our voices heard.”

This year, inspired by Devin’s story, Craig, with the support of Congresswoman Mariannette Miller-Meeks, R-Iowa, began work on the Cooper Davis and Devin Norring Act, a bipartisan piece of legislation that would require social media companies and other communication service providers to alert federal law enforcement when illicit drug dealing and distribution occurs on their platforms. The act is named for Devin, and for Cooper Davis, a Shawnee, Kansas, teen who died of fentanyl poisoning after purchasing a counterfeit Percocet pill on Snapchat.

Craig said that, if passed, the legislation would hold social media companies accountable for the illegal sale of drugs on their platforms.

“Social media platforms are required to inform law enforcement in incidents of child sexual abuse on their platforms, but there are no consequences when they learn of these illicit drugs being sold on their platforms,” she said. “It’s surprising that there isn’t already the requirement.”

New law will give Minnesotans more privacy protection online

MinnPost reports

State Rep. Steve Elkins admits to some irritation when he would see a privacy notice pop up on his computer screen only to be told that his home state doesn’t have the same privacy rights as California, Virginia or Colorado.

Those states are part of a national trend toward explicitly requiring websites to give users enhanced privacy rights and tell them what those rights are.

After five years of trying, Elkins succeeded in May in adding Minnesota to that list with bill language that matches other states and in some instances exceeds them. Starting next July, Minnesota users can prevent personal data from being sold to data brokers and block that data from being used to target advertising at them. In the case of sensitive personal data, such as precise location and biometric data, users would have to give permission before it can be used.

More information on the law…

“Currently, tech companies can collect and sell data such as names, addresses, phone numbers, email addresses, payment information, social security numbers, and so much more,” Westlin said before the bill passed. “When Minnesotans engage with tech platforms, they deserve to know what data is being collected, where it is being stored, whether it is secure, and whether their data is being sold.

“The Minnesota Consumer Data Privacy Act gives Minnesotans rights over their data: the right to access the data, to correct the data … to delete their personal data, to obtain a copy of their data and to opt out of the sale of the data,” she said.

Fines can result from failing to comply.

 

Social Media and News: who uses which channels and why

Pew Research Center reports…

Social media platforms are an important part of the American news diet: Half of U.S. adults say they get news at least sometimes from social media in general. But specific platforms differ widely in structure, content and culture. A new Pew Research Center survey finds that the ways in which Americans encounter news on four major platforms – TikTok, X, Facebook and Instagram – vary widely.

Key findings from this study include:

  • Majorities of Facebook, Instagram and TikTok users say keeping up with news is not a reason they use the sites. X (formerly Twitter) is the exception to this pattern: Most X users say that keeping up with news is either a major or minor reason they use the platform, and about half say they regularly get news there.
  • Still, people are seeing news on all four platforms, especially through opinion- or humor-based content. Majorities of users on all four sites say they see people expressing opinions about current events and funny posts that reference current events. On the whole, more people see these types of posts than news articles or breaking news, although many also see that type of content (particularly on X and Facebook).
  • News on each platform comes from a variety of sources. Those who regularly get news on Facebook and Instagram are more likely than those on TikTok and X to get news from friends, family and acquaintances. More news consumers get news from influencers or other people they don’t know personally on TikTok than on other platforms. And news outlets or journalists are a more common source of news on X than on any other site.
  • Most news consumers on each of the platforms studied say they at least sometimes see news on the platform that seems inaccurate. This includes roughly a quarter or more on each site who say they extremely or fairly often see inaccurate news.
  • In general, Democrats tend to be more skeptical than Republicans of the news they see on X, while the reverse is true on Facebook. Among those who regularly get news on X, for example, 42% of Democrats and independents who lean toward the Democratic Party say they often see news there that seems inaccurate, compared with 31% of Republicans and GOP leaners.

 

MN Bill introduced HF5452: regulating social media for 15 and younger

The Introduction and First Reading of House Bills reports…

HF5452, A bill for an act relating to consumer protection; regulating the use of social media for minors ages 15 and younger; requiring anonymous age verification for websites harmful to minors; proposing coding for new law in Minnesota Statutes, chapter 325F.

The bill was read for the first time and referred to the Committee on Commerce Finance and Policy.

Here’s the bill as introduced…

A bill for an act
relating to consumer protection; regulating the use of social media for minors ages
15 and younger; requiring anonymous age verification for websites harmful to
minors; proposing coding for new law in Minnesota Statutes, chapter 325F.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF MINNESOTA:

Section 1. 

[325F.6945] SOCIAL MEDIA; USE BY MINORS.

Subdivision 1. 

Definitions.

(a) The following terms have the meanings given.

(b) “Account holder” means a resident who opens an account or creates a profile or is
identified by the social media platform by a unique identifier while using or accessing a
social media platform when the social media platform knows or has reason to believe the
resident is located in this state.

(c) “Daily active users” means the number of unique users in the United States who used
the online forum, website, or application at least 80 percent of the days during the previous
12 months, or if the online forum, website, or application did not exist during the previous
12 months, the number of unique users in the United States who used the online forum,
website, or application at least 80 percent of the days during the previous month.

(d) “Resident” means a person who lives in this state for more than six months of the
year.

(e) “Social media platform” means an online forum, website, or application that satisfies
each of the following criteria:

(1) allows users to upload content or view the content or activity of other users;

(2) ten percent or more of the daily active users who are younger than 16 years of age
spend on average two hours per day or longer on the online forum, website, or application
on the days when using the online forum, website, or application during the previous 12
months, or if the online forum, website, or application did not exist during the previous 12
months, during the previous month;

(3) employs algorithms that analyze user data or information on users to select content
for users; and

(4) has any of the following addictive features:

(i) infinite scrolling, which means either continuously loading content or content that
loads as the user scrolls down the page without the need to open a separate page, or seamless
content or the use of pages with no visible or apparent end or page breaks;

(ii) push notifications or alerts sent by the online forum, website, or application to inform
a user about specific activities or events related to the user’s account;

(iii) displays personal interactive metrics that indicate the number of times other users
have clicked a button to indicate their reaction to content or have shared or reposted the
content;

(iv) autoplay video or video that begins to play without the user first clicking on the
video or on a play button for that video; or

(v) live-streaming or a function that allows a user or advertiser to broadcast live video
content in real-time.

The term “social media platform” does not include an online service, website, or application
where the exclusive function is email or direct messaging consisting of text, photographs,
pictures, images, or videos shared only between the sender and the recipients, without
displaying or posting publicly or to other users not specifically identified as the recipients
by the sender.

Subd. 2. 

Requirements; minors younger than 14 years of age and social media.

(a)
A social media platform shall prohibit a minor who is younger than 14 years of age from
entering into a contract with a social media platform to become an account holder. A social
media company must terminate any account held by an account holder younger than 14
years of age, including accounts that the social media platform treats or categorizes as
belonging to an account holder who is likely younger than 14 years of age for purposes of
targeting content or advertising, and provide 90 days for an account holder to dispute the
termination. Termination must be effective upon the expiration of the 90 days if the account
holder fails to effectively dispute the termination.

(b) A social media platform must allow an account holder younger than 14 years of age
to request to terminate the account, and termination must be effective within five business
days after the request. A social media company must also allow the confirmed parent or
guardian of an account holder younger than 14 years of age to request that the minor’s
account be terminated and termination must be effective within ten business days after the
request.

(c) The social media platform must permanently delete all personal information held by
the social media platform relating to the terminated account, unless there are legal
requirements to maintain the information.

Subd. 3. 

Requirements; minors 14 and 15 years of age.

(a) A social media platform
shall prohibit a minor who is 14 or 15 years of age from entering into a contract with a social
media platform to become an account holder, unless the minor’s parent or guardian provides
consent for the minor to become an account holder. A social media platform must terminate
any account held by an account holder who is 14 or 15 years of age, including accounts that
the social media platform treats or categorizes as belonging to an account holder who is
likely 14 or 15 years of age for purposes of targeting content or advertising, if the account
holder’s parent or guardian has not provided consent for the minor to create or maintain the
account. The social media platform must provide 90 days for an account holder to dispute
the termination. Termination must be effective upon the expiration of the 90 days if the
account holder fails to effectively dispute the termination.

(b) A social media platform must allow an account holder who is 14 or 15 years of age
to request to terminate the account, and termination must be effective within five business
days after the request. A social media platform must allow the confirmed parent or guardian
of an account holder who is 14 or 15 years of age to request that the minor’s account be
terminated, and termination must be effective within ten business days after the request.

(c) A social media platform must permanently delete all personal information held by
the social media platform relating to the terminated account, unless there are legal
requirements to maintain the information.

Subd. 4. Enforcement; penalties. (a) Any knowing or reckless violation of this section
is deemed an unfair and deceptive trade practice actionable under this chapter by the attorney
general and the attorney general may bring an action against a social media platform for an
unfair or deceptive act or practice. In addition to other remedies available under section
8.31, the attorney general may collect a civil penalty of up to $50,000 per violation and
reasonable attorney fees and court costs. When the social media platform’s failure to comply
with this section is a consistent pattern of knowing or reckless conduct, punitive damages
may be assessed against the social media platform consistent with section 549.20.

(b) If, by its own inquiry or as a result of complaints, the attorney general has reason to
believe that an entity or person has engaged in, or is engaging in, an act or practice that
violates this section, the attorney general may investigate using all available remedies under
the law.

Subd. 5. Enforcement; damages to minor account holder. A social media platform
that knowingly or recklessly violates this section is liable to the minor account holder,
including court costs and reasonable attorney fees as ordered by the court. Claimants may
be awarded up to $10,000 in damages. A civil action for a claim under this subdivision must
be brought within one year from the date the complainant knew, or reasonably should have
known, of the alleged violation. An action brought under this subdivision may only be
brought on behalf of a minor account holder.

Subd. 6. Jurisdiction; social media platform contracts. (a) For purposes of bringing
an action under this section, a social media platform that allows a minor account holder
younger than 14 years of age or a minor account holder who is 14 or 15 years of age to
create an account on the platform is considered to be both engaged in substantial activities
within this state and operating, conducting, engaging in, or carrying on a business and doing
business in this state, and is subject to the jurisdiction of the courts of this state.

(b) For the purposes of this section, when a social media platform allows an account
holder to use the social media platform, the account holder, regardless of age, and the social
media platform have entered into a contract.

Subd. 7. Other available remedies. This section does not preclude any other available
remedy at law or equity.

EFFECTIVE DATE. This section is effective August 1, 2024, and applies to causes
of action accruing on or after that date.

Sec. 2. [325F.6946] SOCIAL MEDIA; AGE VERIFICATION.

Subdivision 1. Definitions. (a) The following terms have the meanings given.

(b) “Anonymous age verification” means a commercially reasonable method used by a
government agency or a business for the purpose of age verification which is conducted by
a nongovernmental, independent third party organization that is located in the United States,
and not controlled by a foreign country, the government of a foreign country, or any other
entity formed in a foreign country.

(c) “Commercial entity” includes a corporation, a limited liability company, a partnership,
a limited partnership, a sole proprietorship, and any other legally recognized entity.

(d) “Disseminates” has the meaning given in section 604.30 for dissemination.

(e) “Material harmful to minors” means any material that the average person applying
contemporary community standards would find, taken as a whole, appeals to the prurient
interest and depicts or describes, in a patently offensive way, sexual conduct that when
taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.

(f) “News-gathering organization” means any newspaper, news publication, or news
source printed or published online or on a mobile platform that reports current news and
matters of public interest. News-gathering organization includes but is not limited to a radio
broadcast station, television broadcast station, and cable television operator.

(g) “Publish” means to communicate or make information available to another person
or entity on a publicly available website or application.

(h) “Resident” means a person who lives in this state for more than six months of the
year.

(i) “Substantial portion” means more than 33.3 percent of total material on a website or
application.

Subd. 2. Publishing material harmful to minors; age verification requirements. (a)
A commercial entity that knowingly and intentionally publishes or disseminates material
harmful to minors on a website or application, if the website or application contains a
substantial portion of material harmful to minors, must use anonymous age verification to
verify that the age of a person attempting to access the material is 18 years of age or older
and prevent access to the material by a person younger than 18 years of age.

(b) A commercial entity must ensure that the requirements of subdivision 7 are met.

Subd. 3. Exceptions for news and Internet service providers. (a) This section does
not apply to any bona fide news or public interest broadcast, website video, or report and
does not affect the rights of a news-gathering organization.

(b) An Internet service provider or its affiliates or subsidiaries, a search engine, or a
cloud service provider does not violate this section solely for providing access or connection
to or from a website or other information or content on the Internet or a facility, system, or
network not under the provider’s control, including transmission, downloading, intermediate
storage, or access software, to the extent the provider is not responsible for the creation of
the content of the communication which constitutes material harmful to minors.

Subd. 4. Remedies; attorney general enforcement. (a) A violation of subdivision 2 is
deemed an unfair and deceptive trade practice actionable under this chapter, and an action
by the attorney general may be brought on behalf of a resident minor against a commercial
entity. If the attorney general has reason to believe that a commercial entity is in violation
of this section, the attorney general may bring an action against the commercial entity for
an unfair or deceptive act or practice. In addition to any other remedy available, the attorney
general may collect a civil penalty of up to $50,000 per violation and reasonable attorney
fees and court costs. When the commercial entity’s failure to comply with this section is a
consistent pattern of conduct of the commercial entity, punitive damages may be assessed
against the commercial entity consistent with section 549.20.

(b) A third party that performs age verification for a commercial entity in violation of
this section is deemed to have committed an unfair and deceptive trade practice actionable
under this chapter, and the attorney general as the enforcing authority, may bring an action
against the third party for an unfair or deceptive act or practice. In addition to other remedies
available, the attorney general may collect a civil penalty of up to $50,000 per violation and
reasonable attorney fees and court costs.

Subd. 5. Remedies for minors. A commercial entity that violates subdivision 2 for
failing to prohibit access or prohibit a minor from future access to material harmful to minors
after a report of unauthorized or unlawful access is liable to the minor for the access,
including court costs and reasonable attorney fees as ordered by the court. Claimants may
be awarded up to $10,000 in damages. A civil action for a claim under this subdivision must
be brought within one year from the date the complainant knew, or reasonably should have
known, of the alleged violation. An action under this subdivision may only be brought on
behalf of or by a resident minor. For purposes of bringing an action under this subdivision,
a commercial entity that publishes or disseminates material harmful to minors on a website
or application, if the website or application contains a substantial portion of material harmful
to minors and the website or application is available to be accessed in this state, is considered
to be both engaged in substantial and not isolated activities within this state and operating,
conducting, engaging in, or carrying on a business and doing business in this state, and is
subject to the jurisdiction of the courts of this state.

Subd. 6. Other available remedies. This section does not preclude any other available
remedy at law or equity.

Subd. 7. Anonymous age verification. A third party conducting anonymous age
verification pursuant to this section:

(1) may not retain personal identifying information used to verify age once the age of
an account holder or a person seeking an account has been verified;

(2) may not use personal identifying information used to verify age for any other purpose;

(3) must keep anonymous any personal identifying information used to verify age, and
the information may not be shared or otherwise communicated to any person; and

(4) must protect personal identifying information used to verify age from unauthorized
or illegal access, destruction, use, modification, or disclosure through reasonable security
procedures and practices appropriate to the nature of the personal information.
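The data-handling clauses above amount to a simple protocol: the verifier derives a yes/no answer and keeps nothing. A purely illustrative sketch of that flow, assuming a verifier that checks only birth year (all names here, `check_age` and `VerificationResult`, are ours, not the bill's; a real verifier would inspect a government ID):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VerificationResult:
    is_18_or_older: bool  # the only fact passed back to the platform

def check_age(birth_year: int, current_year: int) -> VerificationResult:
    """Derive the yes/no answer without retaining the input.

    Per clauses (1)-(3), the identifying input must not be stored,
    reused for another purpose, or shared with any person.
    """
    result = VerificationResult(is_18_or_older=(current_year - birth_year >= 18))
    # No storage, logging, or transmission of birth_year happens here.
    return result

print(check_age(2000, 2024).is_18_or_older)  # True
```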

EFFECTIVE DATE. This section is effective August 1, 2024, and applies to causes
of action accruing on or after that date.

EVENTS: Tech classes for nonprofits from MN Council of Nonprofits

I thought the following classes from MN Council of Nonprofits might be of interest to some readers…

Tech Tools and Techniques to Boost Efficiency
Wednesday, May 22  |  1pm – 3pm  |  Virtual
Discover concrete ways to take advantage of technology to complete tasks, get work done, and make sure nothing falls through the cracks, from your inbox to workflow automation.

Intentionally Building Community Across Cultures
Wednesday, May 22  |  9am – 11am  |  Virtual
Explore the ways biases have interfered in building authentic relationships with colleagues and community members.

Spotlight: Cybersecurity for Nonprofits
Friday, June 21  |  9am – 3:30pm  |  Virtual
In this day-long training, learn need-to-know cybersecurity trends, including AI, ransomware, and cyber insurance, that will help prepare you for a safe and secure future.