MN Bill introduced HF5452: regulating social media for ages 15 and younger

The Introduction and First Reading of House Bills reports…

HF5452, A bill for an act relating to consumer protection; regulating the use of social media for minors ages 15 and younger; requiring anonymous age verification for websites harmful to minors; proposing coding for new law in Minnesota Statutes, chapter 325F.

The bill was read for the first time and referred to the Committee on Commerce Finance and Policy.

Here’s the bill as introduced…

A bill for an act
relating to consumer protection; regulating the use of social media for minors ages
15 and younger; requiring anonymous age verification for websites harmful to
minors; proposing coding for new law in Minnesota Statutes, chapter 325F.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF MINNESOTA:

Section 1. 

[325F.6945] SOCIAL MEDIA; USE BY MINORS.

Subdivision 1. 

Definitions.

(a) The following terms have the meanings given.

(b) “Account holder” means a resident who opens an account or creates a profile or is
identified by the social media platform by a unique identifier while using or accessing a
social media platform when the social media platform knows or has reason to believe the
resident is located in this state.

(c) “Daily active users” means the number of unique users in the United States who used
the online forum, website, or application at least 80 percent of the days during the previous
12 months, or if the online forum, website, or application did not exist during the previous
12 months, the number of unique users in the United States who used the online forum,
website, or application at least 80 percent of the days during the previous month.

(d) “Resident” means a person who lives in this state for more than six months of the
year.

(e) “Social media platform” means an online forum, website, or application that satisfies
each of the following criteria:

(1) allows users to upload content or view the content or activity of other users;

(2) ten percent or more of the daily active users who are younger than 16 years of age
spend on average two hours per day or longer on the online forum, website, or application
on the days when using the online forum, website, or application during the previous 12
months, or if the online forum, website, or application did not exist during the previous 12
months, during the previous month;

(3) employs algorithms that analyze user data or information on users to select content
for users; and

(4) has any of the following addictive features:

(i) infinite scrolling, which means either continuously loading content or content that
loads as the user scrolls down the page without the need to open a separate page, or seamless
content or the use of pages with no visible or apparent end or page breaks;

(ii) push notifications or alerts sent by the online forum, website, or application to inform
a user about specific activities or events related to the user’s account;

(iii) displays personal interactive metrics that indicate the number of times other users
have clicked a button to indicate their reaction to content or have shared or reposted the
content;

(iv) autoplay video or video that begins to play without the user first clicking on the
video or on a play button for that video; or

(v) live-streaming or a function that allows a user or advertiser to broadcast live video
content in real-time.

The term “social media platform” does not include an online service, website, or application
where the exclusive function is email or direct messaging consisting of text, photographs,
pictures, images, or videos shared only between the sender and the recipients, without
displaying or posting publicly or to other users not specifically identified as the recipients
by the sender.
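
The definition above is unusually quantitative for a statute, which makes it easy to illustrate. Here is a minimal sketch (in Python, purely illustrative, not legal advice) of the daily-active-user test in paragraph (c) and the usage criterion in clause (2). The data structures are hypothetical, and reading “ten percent or more of the daily active users who are younger than 16” as a share of under-16 daily active users is my assumption, since the denominator is ambiguous in the text:

from dataclasses import dataclass

# Hypothetical structures; the bill does not prescribe an implementation.
@dataclass
class UserStats:
    age: int                         # user's age in years
    days_active: int                 # days of use in the measurement window
    avg_hours_on_active_days: float  # average hours on the days of use

DAYS_IN_WINDOW = 365  # previous 12 months; the bill says to use the
                      # previous month if the service is newer than that

def is_daily_active_user(u: UserStats) -> bool:
    # Paragraph (c): used the service at least 80 percent of the days.
    return u.days_active >= 0.8 * DAYS_IN_WINDOW

def meets_minor_usage_criterion(users: list[UserStats]) -> bool:
    # Clause (2): ten percent or more of daily active users younger than
    # 16 average two hours per day or longer on the days they use it.
    dau_under_16 = [u for u in users if is_daily_active_user(u) and u.age < 16]
    if not dau_under_16:
        return False
    heavy = sum(1 for u in dau_under_16 if u.avg_hours_on_active_days >= 2.0)
    return heavy / len(dau_under_16) >= 0.10

def is_social_media_platform(users: list[UserStats],
                             allows_user_content: bool,
                             algorithmic_selection: bool,
                             addictive_feature_count: int) -> bool:
    # Clauses (1) through (4) must all be satisfied.
    return (allows_user_content
            and meets_minor_usage_criterion(users)
            and algorithmic_selection
            and addictive_feature_count > 0)

Note that clause (2) could also be read with all daily active users as the denominator; that ambiguity is exactly the kind of detail rulemaking or a court would have to settle.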

Subd. 2. 

Requirements; minors younger than 14 years of age and social media.

(a)
A social media platform shall prohibit a minor who is younger than 14 years of age from
entering into a contract with a social media platform to become an account holder. A social
media company must terminate any account held by an account holder younger than 14
years of age, including accounts that the social media platform treats or categorizes as
belonging to an account holder who is likely younger than 14 years of age for purposes of
targeting content or advertising, and provide 90 days for an account holder to dispute the
termination. Termination must be effective upon the expiration of the 90 days if the account
holder fails to effectively dispute the termination.

(b) A social media platform must allow an account holder younger than 14 years of age
to request to terminate the account, and termination must be effective within five business
days after the request. A social media company must also allow the confirmed parent or
guardian of an account holder younger than 14 years of age to request that the minor’s
account be terminated and termination must be effective within ten business days after the
request.

(c) The social media platform must permanently delete all personal information held by
the social media platform relating to the terminated account, unless there are legal
requirements to maintain the information.
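
To make the timelines in this subdivision concrete, here is a small sketch of the three effective-date rules; note that I am simplifying the five- and ten-day windows to calendar days, while the bill specifies business days:

from datetime import date, timedelta

def platform_termination_effective(notice: date, disputed: bool) -> date | None:
    # Platform-initiated termination: effective when the 90-day dispute
    # window expires, unless the holder effectively disputes it.
    return None if disputed else notice + timedelta(days=90)

def holder_request_effective(request: date) -> date:
    # The account holder's own request: within five business days
    # (calendar days used here for illustration).
    return request + timedelta(days=5)

def parent_request_effective(request: date) -> date:
    # A confirmed parent or guardian request: within ten business days.
    return request + timedelta(days=10)

The same deadlines reappear in subdivision 3 for 14- and 15-year-olds, so a single implementation would serve both.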

Subd. 3. 

Requirements; minors 14 and 15 years of age.

(a) A social media platform
shall prohibit a minor who is 14 or 15 years of age from entering into a contract with a social
media platform to become an account holder, unless the minor’s parent or guardian provides
consent for the minor to become an account holder. A social media platform must terminate
any account held by an account holder who is 14 or 15 years of age, including accounts that
the social media platform treats or categorizes as belonging to an account holder who is
likely 14 or 15 years of age for purposes of targeting content or advertising, if the account
holder’s parent or guardian has not provided consent for the minor to create or maintain the
account. The social media platform must provide 90 days for an account holder to dispute
the termination. Termination must be effective upon the expiration of the 90 days if the
account holder fails to effectively dispute the termination.

(b) A social media platform must allow an account holder who is 14 or 15 years of age
to request to terminate the account, and termination must be effective within five business
days after the request. A social media platform must allow the confirmed parent or guardian
of an account holder who is 14 or 15 years of age to request that the minor’s account be
terminated, and termination must be effective within ten business days after the request.

(c) A social media platform must permanently delete all personal information held by
the social media platform relating to the terminated account, unless there are legal
requirements to maintain the information.

Subd. 4. 

Enforcement; penalties.

(a) Any knowing or reckless violation of this section
is deemed an unfair and deceptive trade practice actionable under this chapter by the attorney
general and the attorney general may bring an action against a social media platform for an
unfair or deceptive act or practice. In addition to other remedies available under section
8.31, the attorney general may collect a civil penalty of up to $50,000 per violation and
reasonable attorney fees and court costs. When the social media platform’s failure to comply
with this section is a consistent pattern of knowing or reckless conduct, punitive damages
may be assessed against the social media platform consistent with section 549.20.

(b) If, by its own inquiry or as a result of complaints, the attorney general has reason to
believe that an entity or person has engaged in, or is engaging in, an act or practice that
violates this section, the attorney general may investigate using all available remedies under
the law.

Subd. 5. 

Enforcement; damages to minor account holder.

A social media platform
that knowingly or recklessly violates this section is liable to the minor account holder,
including court costs and reasonable attorney fees as ordered by the court. Claimants may
be awarded up to $10,000 in damages. A civil action for a claim under this subdivision must
be brought within one year from the date the complainant knew, or reasonably should have
known, of the alleged violation. An action brought under this subdivision may only be
brought on behalf of a minor account holder.

Subd. 6. 

Jurisdiction; social media platform contracts.

(a) For purposes of bringing
an action under this section, a social media platform that allows a minor account holder
younger than 14 years of age or a minor account holder who is 14 or 15 years of age to
create an account on the platform is considered to be both engaged in substantial activities
within this state and operating, conducting, engaging in, or carrying on a business and doing
business in this state, and is subject to the jurisdiction of the courts of this state.

(b) For the purposes of this section, when a social media platform allows an account
holder to use the social media platform, the account holder, regardless of age, and the social
media platform have entered into a contract.

Subd. 7. 

Other available remedies.

This section does not preclude any other available
remedy at law or equity.

EFFECTIVE DATE.

This section is effective August 1, 2024, and applies to causes
of action accruing on or after that date.

Sec. 2. 

[325F.6946] SOCIAL MEDIA; AGE VERIFICATION.

Subdivision 1. 

Definitions.

(a) The following terms have the meanings given.

(b) “Anonymous age verification” means a commercially reasonable method used by a
government agency or a business for the purpose of age verification which is conducted by
a nongovernmental, independent third party organization that is located in the United States,
and not controlled by a foreign country, the government of a foreign country, or any other
entity formed in a foreign country.

(c) “Commercial entity” includes a corporation, a limited liability company, a partnership,
a limited partnership, a sole proprietorship, and any other legally recognized entity.

(d) “Disseminates” has the meaning given in section 604.30 for dissemination.

(e) “Material harmful to minors” means any material that the average person applying
contemporary community standards would find, taken as a whole, appeals to the prurient
interest and depicts or describes, in a patently offensive way, sexual conduct that when
taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.

(f) “News-gathering organization” means any newspaper, news publication, or news
source printed or published online or on a mobile platform that reports current news and
matters of public interest. News-gathering organization includes but is not limited to a radio
broadcast station, television broadcast station, and cable television operator.

(g) “Publish” means to communicate or make information available to another person
or entity on a publicly available website or application.

(h) “Resident” means a person who lives in this state for more than six months of the
year.

(i) “Substantial portion” means more than 33.3 percent of total material on a website or
application.

Subd. 2. 

Publishing material harmful to minors; age verification requirements.

(a)
A commercial entity that knowingly and intentionally publishes or disseminates material
harmful to minors on a website or application, if the website or application contains a
substantial portion of material harmful to minors, must use anonymous age verification to
verify that the age of a person attempting to access the material is 18 years of age or older
and prevent access to the material by a person younger than 18 years of age.

(b) A commercial entity must ensure that the requirements of subdivision 7 are met.

Subd. 3. 

Exceptions for news and Internet service providers.

(a) This section does
not apply to any bona fide news or public interest broadcast, website video, or report and
does not affect the rights of a news-gathering organization.

(b) An Internet service provider or its affiliates or subsidiaries, a search engine, or a
cloud service provider does not violate this section solely for providing access or connection
to or from a website or other information or content on the Internet or a facility, system, or
network not under the provider’s control, including transmission, downloading, intermediate
storage, or access software, to the extent the provider is not responsible for the creation of
the content of the communication which constitutes material harmful to minors.

Subd. 4. 

Remedies; attorney general enforcement.

(a) A violation of subdivision 2 is
deemed an unfair and deceptive trade practice actionable under this chapter, and an action
by the attorney general may be brought on behalf of a resident minor against a commercial
entity. If the attorney general has reason to believe that a commercial entity is in violation
of this section, the attorney general may bring an action against the commercial entity for
an unfair or deceptive act or practice. In addition to any other remedy available, the attorney
general may collect a civil penalty of up to $50,000 per violation and reasonable attorney
fees and court costs. When the commercial entity’s failure to comply with this section is a
consistent pattern of conduct of the commercial entity, punitive damages may be assessed
against the commercial entity consistent with section 549.20.

(b) A third party that performs age verification for a commercial entity in violation of
this section is deemed to have committed an unfair and deceptive trade practice actionable
under this chapter, and the attorney general as the enforcing authority, may bring an action
against the third party for an unfair or deceptive act or practice. In addition to other remedies
available, the attorney general may collect a civil penalty of up to $50,000 per violation and
reasonable attorney fees and court costs.

Subd. 5. 

Remedies for minors.

A commercial entity that violates subdivision 2 for
failing to prohibit access or prohibit a minor from future access to material harmful to minors
after a report of unauthorized or unlawful access is liable to the minor for the access,
including court costs and reasonable attorney fees as ordered by the court. Claimants may
be awarded up to $10,000 in damages. A civil action for a claim under this paragraph must
be brought within one year from the date the complainant knew, or reasonably should have
known, of the alleged violation. An action under this subdivision may only be brought on
behalf of or by a resident minor. For purposes of bringing an action under this subdivision,
a commercial entity that publishes or disseminates material harmful to minors on a website
or application, if the website or application contains a substantial portion of material harmful
to minors and the website or application is available to be accessed in this state, is considered
to be both engaged in substantial and not isolated activities within this state and operating,
conducting, engaging in, or carrying on a business and doing business in this state, and is
subject to the jurisdiction of the courts of this state.

Subd. 6. 

Other available remedies.

This section does not preclude any other available
remedy at law or equity.

Subd. 7. 

Anonymous age verification.

A third party conducting anonymous age
verification pursuant to this section:

(1) may not retain personal identifying information used to verify age once the age of
an account holder or a person seeking an account has been verified;

(2) may not use personal identifying information used to verify age for any other purpose;

(3) must keep anonymous any personal identifying information used to verify age, and
the information may not be shared or otherwise communicated to any person; and

(4) must protect personal identifying information used to verify age from unauthorized
or illegal access, destruction, use, modification, or disclosure through reasonable security
procedures and practices appropriate to the nature of the personal information.
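
Taken together, clauses (1) through (4) describe a verify-and-forget protocol: the third party checks age, returns only a yes/no attestation, and keeps nothing. Here is a minimal sketch of that flow; verify_government_id() is a hypothetical stand-in for a real document check, and the HMAC-signed attestation is my design choice, not something the bill specifies:

import hashlib
import hmac
import os
import secrets

VERIFIER_KEY = os.urandom(32)  # the verifier's secret signing key

def verify_government_id(id_document: bytes) -> int:
    # Hypothetical: validate the document and return the holder's age.
    raise NotImplementedError

def anonymous_age_check(id_document: bytes) -> dict:
    age = verify_government_id(id_document)
    is_adult = age >= 18
    nonce = secrets.token_hex(16)
    # Sign only the boolean outcome; the attestation carries no identity.
    tag = hmac.new(VERIFIER_KEY, f"{nonce}:{is_adult}".encode(),
                   hashlib.sha256).hexdigest()
    del id_document, age  # clauses (1) and (2): retain nothing, reuse nothing
    return {"nonce": nonce, "is_adult": is_adult, "signature": tag}

The website receiving the attestation learns nothing about the person beyond the yes/no answer, and can confirm the signature with the verifier.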

EFFECTIVE DATE.

This section is effective August 1, 2024, and applies to causes
of action accruing on or after that date.

EVENTS: Tech classes for nonprofits from MN Council of Nonprofits

I thought the following classes from MN Council of Nonprofits might be of interest to some readers…

Tech Tools and Techniques to Boost Efficiency
Wednesday, May 22  |  1pm – 3pm  |  Virtual
Discover concrete ways to take advantage of technology to complete tasks, get work done, and make sure nothing falls through the cracks, from your inbox to workflow automation.

Intentionally Building Community Across Cultures
Wednesday, May 22  |  9am – 11am  |  Virtual
Explore the ways biases have interfered in building authentic relationships with colleagues and community members.

Spotlight: Cybersecurity for Nonprofits
Friday, June 21  |  9am – 3:30pm  |  Virtual
In this day-long training, learn need-to-know trends in cybersecurity that will help prepare you for a safe and secure future, including AI, ransomware, and cyber insurance.

A look at the landscape of online content, AI, usage and regulations

Project Liberty is an international impact organization founded by Frank McCourt to build a new civic architecture for a safer, healthier internet. They have a recent article on regulating technology in this new world of AI and concerns about social media. The topics have never seemed less futuristic than now as I track bills in the Minnesota Legislature (MN HF3488: compensation for kids’ content online and MN4400: Prohibiting Social Media Manipulation Act) and watch the federal courts (Supreme Court looks at online speech). I thought I’d share some of Project Liberty’s recent newsletter…

Growing concern

From the discrimination of algorithms, to AI-generated deepfakes and disinformation, to content on social media platforms that’s harmful for children’s mental health, there is a growing consciousness worldwide about the problems caused by today’s technology.

  • Last week, a report issued by the US State Department concluded that the most advanced AI systems could “pose an extinction-level threat to the human species.”
  • Last month, Project Liberty Foundation released research finding that the majority of adults globally believe that social media companies bear “a great deal” of responsibility for making the internet safe.
  • Last year, a poll done by Project Liberty Alliance member Issue One found that 87% of the US electorate want government action to combat the harms being caused by social media platforms.

Tech is fast, passing laws is slow

Lawmakers are beginning to take action.

While the US lags behind Europe in comprehensive regulations around tech (the EU has been the world’s leader in passing laws to regulate big tech for years), Europe’s speed in passing laws has not translated into ease of enforcing them.

The era of DIY regulation

In the absence of comprehensive laws and sound enforcement, there’s a patchwork of solutions emerging at every level.

  • States: Filling the void left by inaction at the US federal level, US states are taking action. Nearly 200 bills aimed at regulating AI were introduced in state legislatures in 2023 (only 12 of which became law), and this year states across the US will debate over 400 AI-related bills. To limit the harms caused by social media, US states have taken a variety of approaches, leading to a lack of consistency and a patchwork of directives, according to a report by Brookings last year.

MN4400: Prohibiting Social Media Manipulation Act placed on General Register

The House Judiciary Finance and Civil Law Committee heard HF4400 (Stephenson) Prohibiting Social Media Manipulation Act created, social media platforms regulated, and private right of action and attorney general enforcement provided.

The bill is based on a report written for the Legislature last year. The intent is particularly to protect children. It establishes a presumption of privacy, better testing for bots and other potential bad actors, the ability to break the algorithm based on clicks, and no more testing on kids without permission.

Pro Comments

  • Social media is dangerous.
  • We can regulate safety of experience – just like we can in bookstores
  • We should not be beholden to social media companies. They sell our information

Con Comments

  • Let’s not have a private right of action; let’s talk to vendors. (A2 Amendment.)
  • It is challenging to do this without crossing constitutional lines
  • How can we handle border counties?
  • What’s the definition of social media? Do we include educational platforms?

Placed on General Register.

EVENT Mar 14: MN House committee will discuss HF4400: the Prohibiting Social Media Manipulation Act

Happening tomorrow at 8:30 – and livestreamed for our convenience…

Date: Thursday, March 14

Time: 8:30 a.m.

Event: House Judiciary Finance and Civil Law Committee

Agenda:

HF3812 (Acomb) Immunity provided for individuals assisting another to seek medical assistance for drug-related overdose.

HF4048 (Tabke) Department of Corrections; various provisions modified relating to data sharing, correctional officer use of deadly force, electronic filing of detainer, disclosure to victims of city and zip codes of offender after incarceration, disqualifying medical conditions, health care peer review committee, jail inspection data, medical director designee, Supervised Release Board, probation report date, and comprehensive community supervision and probation services.

HF4366 (Edelson) Civil commitment priority admission requirements modified, prisoner in a correctional facility specified to not be responsible for co-payments for mental health medications, county co-payment expense reimbursement allowed, and money appropriated.

HF4210 (Reyer) Hospital behavioral health crisis intervention team requirements established, behavioral health crisis intervention grant program for hospitals established, provisions preventing violence against health care workers modified, public disclosure of emergency department wait times required, and money appropriated.

HF4400 (Stephenson) Prohibiting Social Media Manipulation Act created, social media platforms regulated, and private right of action and attorney general enforcement provided.

HF2319 (Hollins) Admission in judicial proceeding of custodial statements prohibited.

Channel: HTV1

AI and tax preparations: a teachable (information literacy) moment for all of us

The Washington Post ran an interesting article on AI (artificial intelligence) and tax preparation. It’s interesting reading for folks who are worried or curious about AI and its place in our world. It shows that AI isn’t going to be running the world tomorrow and that we all need to beware when we’re getting advice online. Just because you read it online or hear it from a chatbot doesn’t mean it’s true. They offer some good examples…

This year, TurboTax and H&R Block added artificial intelligence to the tax-prep software used by millions of us. Now while you’re doing your taxes online, there are AI chatbots on the right side of the screen to answer your burning questions.

But watch out: Rely on either AI for even lightly challenging tax questions, and you could end up confused. Or maybe even audited.

Here’s one example: Where should your child file taxes if she goes to college out of state? When I asked, TurboTax’s “Intuit Assist” bot offered irrelevant advice about tax credits and extensions. H&R Block’s “AI Tax Assist” bot gave me the wrong impression she has to file in both places. (The correct answer: She only files in the other state if she has earned income there.)

Question after question, I got many of the same random, misleading or inaccurate AI answers.

What’s going on? We’ve all heard about the incredible possibilities of generative AI. But now we have to wade through a parade of terrible AI products, as companies stuff still-experimental AI into everyday things. For consumers, it’s on us to figure out how to size up each new AI we encounter. (Come across an AI that needs some investigation? Send me an email.)

MN House Committee meets about HF4400: Prohibiting Social Media Manipulation Act

The Minnesota House Commerce Finance and Policy committee talked about HF4400 (Stephenson); Prohibiting Social Media Manipulation Act created, social media platforms regulated, and private right of action and attorney general enforcement provided. (I will replace livestream with archived video when available.)

They heard a report from the Attorney General on kids and social media.

They heard from a tech association that opposed the bill. They think it will degrade the online experience for Minnesotans, that it might overburden the system, and that it might pave the road for racism and other biased views. They also mention free speech and access to information, arguing the bill will lose in court on First Amendment grounds.

Questions:

We know this is an issue. Legal aspects of regulation are a concern. Three areas of concern:
1. Constitutional issues – cannot regulate speech. (US Supreme Court is looking at this)
2. Workability
3. Private right of action is not the way to go.
We look forward to the conversation. Your issues are important. I have mixed feelings on private right of action too.

What about hacking of preferences?
The bill says content has to be based on user preference over engagement.

Can we make the interface accessible? What happens if the user doesn’t set preference?
This bill deals with conflict between stated preference and user engagement.

Is this exclusive to a social media platform? Or can cookies be set up to alter browsing based on stated preferences?
Right now this is for social media but it seems like there could be some ties.

Can we get bipartisan support?
OK.

The bill was re-referred to the Judiciary Finance and Civil Law Committee.

Is the Internet a Pandora’s Box? How can we harness it?

Reed Anfinson, Co-Publisher of the Stevens County Times, published an editorial about the “internet dark side.” He starts by recognizing the value of the Internet for education and economic development but focuses on unsavory activities surrounding the Internet…

As federal, state, and local leaders consider the benefits of broadband access, they must also consider its dark side.

It exposes our youth to online sexual predators. It is responsible for online bullying and isolation and is linked to higher rates of teen suicides.

Amazon preys on the dollars available in our communities to support local businesses as people order everyday basics online.

It is valuable to explore all sides of an issue. Twenty years ago, I traveled around Minnesota to show people the Internet, so this is a perspective I’ve heard before. In some ways those concerned citizens were right about how dark things could be. Clearly, I’m unabashedly a booster for broadband, so I have my perspective too, but I still wanted to tackle the topic. And my bias may show too.

Anfinson focuses on how the internet has left local newspapers in the lurch, which I’m sure he feels acutely as a publisher…

Facebook, Google, and the corporations they own decimate community newspapers while capitalizing on our reporting to earn money. We’ve lost nearly 2,900 newspapers since 2005, and we are losing additional newspapers at two per week. Around 1,800 communities and more than 200 counties no longer have a newspaper.

When newspapers disappear, fake news websites and print products show up pretending to be professional news but, in fact, are backed by political extremes or foreign countries.

Communities without local news aren’t informed about the challenges their communities face. Without that knowledge, infrastructure, quality of life, and community loyalty deteriorate.

Voting rates drop, fewer races are contested, people are less informed about election issues, and may not know who is running for office. People know little about the records of those serving them.

I agree that the death of the local paper has created information deserts; I also recognize that the cow is out of the barn and she’s not going back. So, what can we do to prepare our communities given the current situation? Is there a way to embrace the old and the new? Here are some thoughts:

  • Is there an opportunity to get public funding for local newspapers? Right now, the MN Legislature is looking at a bill that would get funding from paid services to support public access programming. Is there an appetite for something similar for news?
  • Is there a hybrid model for an online newspaper that works with citizen journalists to encourage more content and diverse voices? I was on the board of the Twin Cities Daily Planet, which trained and published citizen journalists. (Sustainable funding was an issue.) It was a great way for New Americans, older residents, and subject specialists to share their stories and for people to learn about a wide range of local events, issues and community members.
  • Information literacy training! This goes back to my days as a librarian. I worked at a university, and every freshman sat through at least one classroom session on information literacy. We looked at how to find the author or corporate sponsor of information and the date of publication, and we asked whether the resource was well respected. We talked about the bias and motives of the information producer and more. We need more training like this – and I’d extend it to include things like how to tell whether a Facebook friend request is real and whether a website is secure enough for a financial transaction. The Supreme Court is currently looking at the future of online speech; regardless of what they decide, the power to assess the quality of information is valuable.
  • Is there an opportunity to partner with the local government? Since the pandemic, many, if not most, local governments livestream public meetings, which makes it much easier for people to attend and engage. No more taking the day off of work to attend a session, no more travel. Providing easy access to those meetings with a public calendar (that maybe also ties into the high school sports schedule) and maybe hosting a conversation during or after the meeting might encourage even greater participation.

Some of these ideas have been done in local communities; several projects were funded by the Blandin Foundation throughout the years as a way to build value in the network. Again, I think the key is not one or the other – but yes and. Yes to local news and yes to broadband.

EVENT Mar 4: MN House committee will discuss HF4400: the Prohibiting Social Media Manipulation Act

The Minnesota House Commerce Finance and Policy committee will be discussing HF4400 (Stephenson); Prohibiting Social Media Manipulation Act created, social media platforms regulated, and private right of action and attorney general enforcement provided.

Here’s the agenda:

Chair: Rep. Zack Stephenson

Monday, March 4, 2024, 1:00 PM

Room 10, State Office Building

AGENDA

Call to Order

Approval of the Minutes – February 28, 2024

HF4100 (Reyer); Debt collection, garnishment, and consumer finance governing provisions modified; debtor protections provided; and statutory forms review required.

HF4400 (Stephenson); Prohibiting Social Media Manipulation Act created, social media platforms regulated, and private right of action and attorney general enforcement provided.

HF4388 (Urdahl); Litchfield allowed to issue on-sale liquor license for town ball games played at ballpark on school grounds.

HF4040 (Kotyza-Witthuhn); Minnesota Securities Act registration provisions modified, and franchise fee deferral modified.

HF4041 (Kotyza-Witthuhn); Financial institution various governing provisions added and modified, and technical changes made.

Adjournment

And the bill as introduced:

A bill for an act
relating to consumer protection; creating the Prohibiting Social Media Manipulation
Act; regulating social media platforms; providing a private right of action and
attorney general enforcement; proposing coding for new law as Minnesota Statutes,
chapter 325O.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF MINNESOTA:

Section 1.

Google Cache no longer caching websites but Wayback Machine does

Ars Technica reports

Google will no longer be keeping a backup of the entire Internet. Google Search’s “cached” links have long been an alternative way to load a website that was down or had changed, but now the company is killing them off. Google “Search Liaison” Danny Sullivan confirmed the feature removal in an X post, saying the feature “was meant for helping people access pages when way back, you often couldn’t depend on a page loading. These days, things have greatly improved. So, it was decided to retire it.”

The feature has been appearing and disappearing for some people since December, and currently, we don’t see any cache links in Google Search. For now, you can still build your own cache links even without the button, just by going to https://webcache.googleusercontent.com/search?q=cache: plus a website URL, or by typing “cache:” plus a URL into Google Search. For now, the cached version of Ars Technica seems to still work. All of Google’s support pages about cached sites have been taken down.

Luckily, the Internet Archive Wayback Machine is still available. Here’s a description of the service from their webpage…

The Internet Archive, a 501(c)(3) non-profit, is building a digital library of Internet sites and other cultural artifacts in digital form. Like a paper library, we provide free access to researchers, historians, scholars, people with print disabilities, and the general public. Our mission is to provide Universal Access to All Knowledge.

We began in 1996 by archiving the Internet itself, a medium that was just beginning to grow in use. Like newspapers, the content published on the web was ephemeral – but unlike newspapers, no one was saving it. Today we have 28+ years of web history accessible through the Wayback Machine and we work with 1,200+ library and other partners through our Archive-It program to identify important web pages.

As our web archive grew, so did our commitment to providing digital versions of other published works. Today our archive contains…

This has been a lifesaver for me over the years. Plug in a URL and it will give you a calendar of days when the site was cached. For example, with the Blandin on Broadband blog you can access the site going back to 2009!
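
If you’d rather script it than browse the calendar, the Internet Archive also exposes a public availability API (documented at archive.org/help/wayback_api.php). Here is a small sketch using only the Python standard library; the blog URL in the example is my assumption:

import json
import urllib.parse
import urllib.request

def closest_snapshot(url: str, timestamp: str = "20090101") -> str | None:
    # Ask the Wayback Machine for the snapshot closest to a given date.
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    endpoint = f"https://archive.org/wayback/available?{query}"
    with urllib.request.urlopen(endpoint) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

# For example, the snapshot of this blog nearest to January 1, 2009:
print(closest_snapshot("blandinonbroadband.org"))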

International Wolf Center in Ely gets $15,000 for digital marketing campaign

IRRR (Iron Range Resources and Rehabilitation) announces recent grant awards, including one that I thought I could fit here…

International Wolf Center in Ely: $15,000 to develop a digital marketing campaign to attract new guests and visitors to Ely, including underrepresented audiences such as communities of color, tribal nations and veterans. Total project investment is estimated at $30,000.

I figured if I mentioned that I could also mention upcoming grant opportunities…

New Culture & Tourism grant cycle opens Jan. 2

Cities, townships, nonprofits, tribal governments and governmental entities located within the agency’s service area are eligible to apply. Key application dates:

  • Jan. 2 – 25: Pre-applications must be submitted by 4:30 p.m.
  • Jan. 31: Full grant applications and required documentation must be submitted by 4:30 p.m.

Applications must be submitted through the FLUXX grant portal. Access the FLUXX portal.

Watch a short two-minute video about Culture & Tourism grants.

Reports say workers with cameras off might be hurting their opportunities

Because most of us are living in a Zoomful world, I thought people might find this interesting; Axios reports

Stunning stat: 92% of executives at medium to large firms think workers who turn cameras off during meetings don’t have long-term futures at the company, according to a new survey from Vyopta, a software company.

Why it matters: The data adds grist to the worry that hybrid and remote employees have expressed about the post-pandemic world — that those who choose to work from home some, most or all of the time will be out-of-sight, out-of-mind for bosses.

They offer a suggestion…

One way to get everyone on the same page is to be more intentional — and explicit — about which meetings should be camera off and which should be camera on, Slate’s Torie Bosch writes.

  • If it’s a get-to-know-you for a big team, tell people ahead of time to prepare to show their faces.

  • If it’s a quick update on an ongoing project, everybody goes dark. Especially if it’s before 9 a.m.

It’s worth noting that sometimes people turn off the camera because they don’t have broadband for the full experience; just another reason we need ubiquitous broadband. And for what it’s worth, I like to walk and take Zoom calls – unless I’m running or presenting at the meeting. It means I don’t take great notes but I do pay better attention!

Lincoln County students learn about downsides of the Internet and social media

I’m thankful to the Tyler Tribute for letting me reprint their article on a recent meeting of students and lawyers about some tricky areas of internet and social media use by teens. I have done similar training in the past so I know how important it is. Often kids are given a very powerful tool with limited safety training, which can be dangerous. Lincoln County schools (with help from the Blandin Foundation) found a way to open dialogue…

Three schools gather at RTR for assembly on downside of the internet

Tuesday, March 22, the students in grades 5-8 from RTR Public School, along with Hendricks Middle School and Lake Benton Elementary, met in the RTR Performing Arts Center for an informative meeting about the downside of the internet. The presentation was given by Joshua Heggem and Kristi Hastings of Pemberton Law Firm, located in Fergus Falls. The presentation was brought to the schools by the efforts of the Lincoln County Sheriff’s Department.

Hastings has represented numerous school districts for many years and talked about social media, technology, and mistakes that other kids have made on social media, which will hopefully be a good learning tool to prevent kids from making these mistakes themselves.

This presentation came about as a response to the number of cases they were seeing. “When we started this, it came about because we were seeing so many disciplinary things coming across our desks. Expulsions and other serious consequences; three kids getting kicked out of sports they love playing because of mistakes they were making on social media,” Hastings told the group. They came up with this presentation with a desire to get ahead of the rise they were seeing in cases based around social media, bullying using social media, and technology use and the dangers it presents.

Statistically, 97% of all kids in grades 5-8 are using social media of some sort every day. “I’m a huge fan of social media myself, so you’re not going to hear that it’s bad or that you shouldn’t use it at all because there are so many positive things that come with social media—the ability to connect with people all over the world, communicate with family and friends—these are all positive things that prior generations didn’t have.”

Hastings went on, “We are just focusing on the downside of social media and unfortunately, as lawyers, we see a lot of it.” Joshua Heggem shared a story of how quickly things can happen when social media is involved. “An instance I had once; a group of seventh graders who had made a Snapchat group for their class—they made it with the intent of bullying one classmate. During these hateful comments aimed at the student, someone said they were going to put a hit out on the classmate. Within hours there were sheriffs at the school interrogating kids for terroristic threats.” Heggem recounted to the kids, “Some kids were charged with crimes; kids were getting suspended. The kid who made the threat, I believe was expelled from school.” Heggem made it clear that expulsion comes with heavy consequences, “That means you can’t set foot on school grounds, you can’t play any sports, you can’t even go to a sporting event, you can’t go to the football field.”

Along with all those who faced charges and school consequences, there were also kids who needed mental health services after the ordeal, including the child who had been the subject of the bullying. Even if the kid who made the threat never meant it, the words were still out there on social media and have to be taken seriously. Heggem made it clear to the kids that things can’t be taken back once said on social media, no matter how safe or secure you think it is.

Hastings touched on things that don’t happen on school grounds; for instance, a kid initiating a fight at the park across the street from the school as opposed to on school grounds. “These school rules follow you when you are at a school-sponsored event, when you’re here on school grounds, but also when you do things that negatively impact other kids’ ability to come here and learn,” Hastings explained.

This brought them to the next topic. “We do have a state law here in Minnesota that prohibits bullying of your classmates; things that are intimidating, threatening, abusive or harmful,” Hastings touched on. “Any bullying that you carry over online is treated the same way. So, for instance, if you push a kid into a locker, that is the equivalent of bullying online and will carry the same punishment.”

They brought up “group thought,” which is the concept that someone comes up with an idea and the group just goes along with it. “It happens a lot in our school cultures and climates because kids have not fully developed, oftentimes, the ability to say no, I’m not interested in that idea or activity,” Hastings explained. An example used was one from another small school in Oakes, North Dakota, which gained national news recognition.

“They had a tradition there of making a straw man before the homecoming game every year. So they would make the straw man and then burn it in a bonfire and then play their game,” Hastings told the kids. “A couple of years ago, someone in a group came up with an idea—let’s make a noose and hang the straw man. Then someone comes up with the idea to put a jersey on it. Well, they put the number of the only player that is a person of color for the other team on the jersey. Someone in the group took a video of it, probably shared it with their close friends and contacts, and someone recognized it was quite racist and it made national headlines. What it does, is it makes the world look at your school and question who lives there, what are they teaching here,” Hastings further explained to the kids.

The presentation touched on many topics that kids today are coming into contact with more and more every day—things like how sending or receiving nude photos is technically a form of child pornography punishable by law, and how sharing pictures of your friends from the locker room as a joke is a form of privacy invasion, also punishable by law. All the topics were relevant and appropriate.

Another presentation was given for the high school grades 9-12 after the middle school session was done, as it is a topic of discussion worth having from middle school on.

Sen. Klobuchar proposes bill targeting social media misinformation

Broadband Breakfast reports

Sen. Amy Klobuchar, D-Minnesota, introduced a bill Thursday that would remove online platforms’ Section 230 liability protections when the platforms are used to spread misinformation about coronavirus vaccines or other public-health emergencies.

Section 230 of the Communications Decency Act protects online platforms like Facebook and Twitter from civil liability for third-party content posted on their platforms. The measure has come under intense scrutiny over the past year, with prominent figures from both major political parties calling for reform.

Klobuchar said she decided to pursue new legislation because previous attempts to persuade Facebook to regulate the content have not been successful, the Wall Street Journal reports.

Interesting to hear some of the reasoning…

The bill’s introduction cites a report from the Center for Countering Digital Hate, which says that only 12 social media pages are responsible for a significant amount of false information being circulated about vaccines.

Last week, President Joe Biden said that Facebook was “killing people” by spreading misinformation about coronavirus vaccines. Biden later clarified his statement, saying that he wasn’t accusing the company of murder, but wanted them to “do something about the misinformation.”

The following day, Facebook rejected Biden’s criticism in a blog post, saying that 85 percent of its U.S. users either want to be or already have been vaccinated, citing this as evidence that Facebook was not the reason Biden’s goal of 70 percent vaccination by July was missed. Facebook said it was helping efforts to vaccinate the country by operating vaccine clinics in low-income communities in several states.

Blandin Broadband Lunch Bunch Digital Ready Communities Notes and Video

Thanks so much to everyone who came to the Lunch Bunch today and especially to Annie Cruz-Porter, Calla Jarvie and Emily Del Real for coming to talk about the Digital Ready Communities program. One fun offshoot of the Fall Broadband conference was that three Minnesota communities were able to work as pilots with the program at Purdue University. Today we got to loop back with the program and partners.

This is a fascinating program that helps communities focus on how folks in a community are connecting with each other and the outside world, especially online. It includes an assessment, a survey, and the creation of a team to be more purposeful about building local, trusted channels for communication, as well as crafting a message that promotes the community to the outside world.

Register for future Lunches: Upcoming May 12 and May 26

And here’s the chat…