In just a few weeks, Americans will set their clocks forward an hour to “spring ahead,” enjoying more evening sunlight and kicking off daylight saving time 2026.
This year, DST begins Sunday, March 8, at 2 a.m., when clocks move forward one hour from 2 a.m. to 3 a.m., welcoming longer evenings in the warmer months.
DST will end Sunday, Nov. 1, when clocks “fall back” one hour from 2 a.m. to 1 a.m., a move that will return the country to standard time, sometimes called “winter time,” for the cooler months.
Under current U.S. federal law, DST begins the second Sunday in March and ends on the first Sunday in November every year.
In 2026, DST will end on Nov. 1, the earliest date on which it can possibly end.
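To see the rule in action, here’s a minimal Python sketch (not from the article; the helper function name is ours) that computes the federal changeover dates for any year and confirms why Nov. 1 is the earliest possible end date:

```python
# Illustrative only: applies the DST rule described above
# (second Sunday in March to first Sunday in November).
from datetime import date, timedelta

def nth_sunday(year: int, month: int, n: int) -> date:
    """Return the nth Sunday of the given month."""
    first = date(year, month, 1)
    # Monday is 0 and Sunday is 6, so this advances to the first Sunday of the month.
    first_sunday = first + timedelta(days=(6 - first.weekday()) % 7)
    return first_sunday + timedelta(weeks=n - 1)

print(nth_sunday(2026, 3, 2))   # 2026-03-08 -> DST begins
print(nth_sunday(2026, 11, 1))  # 2026-11-01 -> DST ends (the earliest possible date)
```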
History of daylight saving time in the U.S.
The U.S. first introduced clock changes in 1918 under the name “Fast Time.”
The change held only for a few years. It was later reintroduced during World War II under the term “War Time.”
However, it wasn’t until the Uniform Time Act of 1966 that the clock-changing ritual in the U.S. was governed by official rules and legislation.
Clock change controversy
DST isn’t universally loved.
A 2025 Gallup Poll found more than half of Americans want to do away with the practice altogether, though whether they’re in favor of DST or standard time varies widely.
Compared with results from Gallup’s 1999 clock-change survey, public support for daylight saving time has plummeted across all demographic groups.
“Most subgroups — by age, political affiliation, income and education — have seen declines in support for DST of 30 percentage points or more, except for low-income Americans, who show a 19-point drop,” the analytics company reports.
Survey results show 48% of Americans prefer permanent standard time, while 24% favor permanent DST.
Only 19% favor continuing to switch between the two.
British startup Synthesia, whose AI platform helps companies create interactive training videos, has raised a $200 million Series E round of funding that brings its valuation to $4 billion — up from $2.1 billion just a year ago.
Unlike some other AI startups that are still a long way from turning a profit, Synthesia has found a lucrative business in transforming corporate training thanks to AI-generated avatars. With enterprise clients including Bosch, Merck, and SAP, the London-based company crossed $100 million in annual recurring revenue (ARR) in April 2025.
Aside from ongoing support from existing backers, this round brings in new investors while letting some shareholders cash out. On one hand, Matt Miller’s VC firm Evantic and the secretive VC firm Hedosophia are joining the cap table as new entrants. On the other, Synthesia will facilitate an employee secondary sale in partnership with Nasdaq, TechCrunch has learned.
To be clear, Synthesia isn’t going public just yet — Nasdaq isn’t acting as a public exchange in this operation, but as a private markets facilitator that will help early team members turn their shares into cash. These employee stock sales often happen outside of this framework, but usually at prices either below or above the company’s official valuation, and are sometimes frowned upon by other shareholders. With this process, all sales will be tied to the same $4 billion valuation as Synthesia’s Series E, while the company keeps an element of control.
“This secondary is first and foremost about our employees,” Synthesia CFO Daniel Kim told TechCrunch. “It gives employees a meaningful opportunity to access liquidity and share in the value they’ve helped create, while we continue to operate as a private company focused on long-term growth.”
For Synthesia, this long-term growth involves going beyond expressive videos and embracing the AI agents trend. According to a press release, the company is developing AI agents that will let its clients’ employees “interact with company knowledge in a more intuitive, human-like way by asking questions, exploring scenarios through role-play, and receiving tailored explanations.”
The company said early pilots have received positive feedback from customers, who reported higher engagement and faster knowledge transfer compared to traditional formats. This positive response explains why Synthesia now plans to make agents a “core strategic focus” to invest in, alongside further product improvements to its existing platform.
While it didn’t disclose revenue forecasts, the company hopes its platform will offer a welcome answer to the struggles of enterprises in keeping their workforce adequately trained despite rapid changes. “We see a rare convergence of two major shifts: a technology shift with AI agents becoming more capable, and a market shift where upskilling and internal knowledge sharing have become board-level priorities,” Synthesia’s co-founder and CEO Victor Riparbelli said in a statement.
Seeing boards care more about employees as a result of AI wasn’t on anyone’s bingo card, except perhaps Riparbelli. Together with his cofounder, Synthesia COO Steffen Tjerrild, Riparbelli took the initiative of conducting a secondary sale so that employees could share in the success of the unicorn company. Founded in 2017, Synthesia now has more than 500 team members, a 20,000-square-foot HQ in London, and additional offices in Amsterdam, Copenhagen, Munich, New York City, and Zurich.
While unusual for a British startup, this coordinated secondary sale isn’t a first and likely won’t be the last, Synthesia’s head of corporate affairs and policy, Alexandru Voica, told TechCrunch. “My guess is that as [U.K.-based] private companies stay private longer, this type of structured, cross-border employee liquidity may become increasingly common, so I wouldn’t be surprised to see others do it, either with Nasdaq or others,” he predicted.
The New England Patriots and the Seattle Seahawks will face off in Super Bowl LX. For those of you who just can’t with Roman numerals, that’s Super Bowl 60, and it’s taking place this year at Levi’s Stadium in Santa Clara, CA, on February 8, starting at 6:30 p.m. ET.
Like all other Sunday Night Football games this season, the championship game will be broadcast on NBC and will stream live on Peacock. And it’s not just the game we’re excited for. This year’s halftime performer is singer and rapper Bad Bunny, and there will be pre-game performances by Charlie Puth, Brandi Carlile, and Coco Jones. It’s truly an incredible lineup of talent. Here’s everything you need to know to tune in to Super Bowl LX when it airs on Feb. 8.
How to watch Super Bowl LX
Date: Sunday, Feb. 8, 2026
Time: 6:30 p.m. ET
TV channel: NBC, Telemundo
Streaming: Peacock, DirecTV, NFL+ and more
2026 Super Bowl game time
The 2026 Super Bowl is set to begin at 6:30 p.m. ET/3:30 p.m. PT on Feb. 8, 2026.
2026 Super Bowl game channel
The 2026 Super Bowl will air on NBC, with a Spanish-language broadcast available on Telemundo.
2026 Super Bowl teams:
The New England Patriots and the Seattle Seahawks will play in the 2026 Super Bowl.
Where is the 2026 Super Bowl being played?
The 2026 Super Bowl will be held at Levi’s Stadium in Santa Clara, CA, home of the San Francisco 49ers.
What teams are playing in the 2026 Super Bowl?
The New England Patriots and the Seattle Seahawks will play in the 2026 Super Bowl, having won the AFC and NFC Championship games, respectively, on Sunday, Jan. 25.
How to watch the 2026 Super Bowl without cable
You can stream NBC and Telemundo on platforms like DirecTV and Hulu + Live TV, both of which are among Engadget’s choices for best streaming services for live TV. (Note that Fubo and NBC are currently in the midst of a contract dispute and NBC channels are not available on the platform.) The game will also be streaming on Peacock and on NFL+, though with an NFL+ subscription, you’re limited to watching the game on mobile devices.
For $11/month, an ad-supported Peacock subscription lets you stream live sports and events airing on NBC, including the 2026 Super Bowl, Winter Olympics coverage, and more. Plus, you’ll get access to thousands of hours of shows and movies, including beloved sitcoms such as Parks and Recreation and The Office, every Bravo show and much more.
For $17 monthly you can upgrade to an ad-free subscription which includes live access to your local NBC affiliate (not just during designated sports and events) and the ability to download select titles to watch offline.
In addition to hosting NBC’s Super Bowl broadcast, DirecTV’s Entertainment tier gets you access to loads of channels where you can tune in to college and pro sports throughout the year, including ESPN, TNT, ACC Network, Big Ten Network, CBS Sports Network, and, depending on where you live, local affiliates for ABC, CBS, Fox, and NBC.
Whichever package you choose, you’ll get unlimited Cloud DVR storage and access to ESPN Unlimited.
DirecTV’s Entertainment tier package is $89.99/month. But you can currently try all this out for free for 5 days. If you’re interested in trying out a live-TV streaming service for football, but aren’t ready to commit, we recommend starting with DirecTV.
Who is performing at the 2026 Super Bowl halftime show?
Bad Bunny, who holds the title as the most-streamed artist in the world, will be headlining the 2026 Super Bowl halftime performance. You can expect that show to begin after the second quarter, likely between 8-8:30 p.m. ET. Singer Charlie Puth will also be at the game to perform the National Anthem, Brandi Carlile is scheduled to sing “America The Beautiful,” and Coco Jones will perform “Lift Every Voice and Sing.”
Where to buy tickets to the 2026 Super Bowl:
Tickets to the 2026 Super Bowl are available on third-party resale platforms like StubHub and Gametime.
A major ice and snow storm is hitting huge parts of the United States this week, and in preparing for the storm, I was watching the forecasts closely across a slew of different weather apps for Android. And, sadly, Pixel Weather was perhaps one of the least accurate I was using.
Millions are being affected this weekend by a major winter storm that’s leaving the southern US covered in ice and the northeast bracing for inches of snow. Through the past week, I’ve been checking several different weather apps on Android, trying to get a sense of what to expect. That was both for my personal safety and for a charity tournament that I really didn’t want to reschedule, knowing what lies ahead for an icy North Carolina over the next couple of weeks. With a lot on the line, finding the most accurate forecast was really important to me.
Yet, as many of you know, no two weather apps will ever say exactly the same thing.
Cutting right to the chase, at least in my area of Winston-Salem, North Carolina, AccuWeather gets the crown. It was the only app that predicted this storm would deliver more ice than snow, and it consistently had the timeline right, with the storm landing on Saturday evening. That aligns with my experience using AccuWeather over the past couple of years. While it can certainly be wrong, as any weather forecast can be, more often than not it gives me a pretty accurate long(ish)-range forecast. The Weather Channel was right up there as well, though I loathe using that website, as it has so many ads it actively slows down any device I’m using.
What stood out to me this week was just how far off Pixel Weather was, and how far off it continues to be in the midst of the storm.
Pixel Weather was adamant that my area would see a lot of snow, with the forecast ranging from 10-30 inches depending on the day. And through the week, it could never decide on when that would start, with times ranging from mid-morning to late at night. Most of the other apps I was using were all pretty certain that the precipitation wouldn’t begin until at least the late afternoon.
But, frankly, that’s why I was using several different forecasting apps in the first place. I always knew that no one would agree and that, ultimately, I’d need to keep an eye on everything and look for the patterns to figure out what would happen. That’s how I figured out that Pixel Weather was pretty far off the mark. And I wasn’t alone there. As Droid-Life highlighted, despite Google using weather models to make its predictions, its AI applied to those models seems to miss the mark when what’s coming isn’t all that straightforward.
And that’s continuing right in the middle of the storm.
I’m writing this after having spent a couple of hours shoveling snow that fell overnight, around an inch or two, as most of the forecasts predicted. But as I was out in the cold, Pixel Weather insisted that I should be seeing snow right then and there, even as freezing rain was pelting me in the face, as it has been most of the morning. A lot more ice is in the forecast for the next 12 hours in my area, but Pixel Weather insists that not only should it be snowing right this minute, but that snow is all we’ll be seeing through the end of the day. The only mention of ice is buried under the “Precipitation” page, where Google doesn’t give any nuance whatsoever about what’s coming later today.
For someone in my area, looking solely at the Pixel Weather app would be a problem.
Snow is one thing, but traveling on ice can be very dangerous, and the amount we’ll get should play a huge role in preparation. Foreca, another weather app that I felt was among the least accurate this week, has the same issue: no mention of ice tonight, just a prediction of 8 inches of snow that no one else is making.
The NWS alert warns about ice, but Pixel Weather says to expect nothing but snow.
Of course, this is all anecdotal.
Going back to those reports that Droid-Life surfaced, a clear-cut example of Pixel Weather’s deficiencies can be seen up in Canada, where one user reports an 8-degree difference between the temperature Pixel Weather is showing and what an actual weather station nearby is reporting. And in the replies, other users noticed, just like me, how different Google’s forecast has been compared to others over the past few days.
Predicting the weather is very complicated, especially in storms like this one. But these complicated situations serve as a really good reminder that you shouldn’t get all of your information from one place. The weather app on your phone, regardless of which one you’re using, will never be fully accurate.
Let us know in the comments below what weather apps you’ve been using and, for everyone affected by this storm, stay safe!
A woman died Sunday afternoon after a snow plow backed into her as she walked through the MBTA’s Norwood Central parking lot with her husband, according to Transit Police.
The collision happened around 2 p.m. when a Ford F-350 driver who was helping with snow removal struck the couple while driving in reverse, MBTA Transit Police Superintendent Richard Sullivan said in an email.
The 51-year-old woman was declared dead as a result of her injuries, Sullivan said. The man, 47, was taken to a hospital with non-life-threatening injuries.
The snow plow driver, a 33-year-old man, stayed at the scene and cooperated with police, Sullivan said. At the time of the collision, he was working for a private company the MBTA had contracted with to help with snow removal amid the snowstorm.
Transit Police detectives are working with the Norfolk County District Attorney’s Office to investigate the woman’s death.
“This is an unimaginable, horrific incident,” Sullivan wrote in the email. “On behalf of the Transit Police and the entire MBTA organization, we express our most sincere condolences to the victim’s family and friends.”
No further information, including the woman’s identity, had been released as of Sunday night.
Sunny Sethi, founder of HEN Technologies, doesn’t sound like someone who’s disrupted an industry that has remained largely unchanged since the 1960s. His company builds fire nozzles — specifically, nozzles that it says increase suppression rates by up to 300% while conserving 67% of water. But Sethi is matter-of-fact about this achievement, more focused on what’s next than what’s already been done. And what’s next sounds a lot bigger than fire nozzles.
His path to firefighting doesn’t follow a tidy narrative. After nabbing his PhD at the University of Akron, where he researched surfaces and adhesion, he founded ADAP Nanotech, an outfit that developed a carbon nanotube-based portfolio and won Air Force Research Lab grants. Next, at SunPower, he developed new materials and processes for shingled photovoltaic modules. When he landed next at a company called TE Connectivity, he worked on devices with new adhesive formulations to enable faster manufacturing in the automotive industry.
Then came a challenge from his wife. The two had moved from Ohio to the East Bay outside San Francisco in 2013. A few years later came the Thomas Fire — the only megafire they’d ever see, they thought. Then came the Camp Fire, then the Napa-Sonoma fires. The breaking point came in 2019. Sethi was traveling during evacuation warnings while his wife was home alone with their then three-year-old daughter, no family nearby, facing a potential evacuation order. “She was really mad at me,” Sethi recalls. “She’s like, ‘Dude, you need to fix this, otherwise you’re not a real scientist.’”
A background spanning nanotechnology, solar, semiconductors, and automotive had made his thinking “bias free and flexible,” as he puts it. He’d seen so many industries, so many different problems. Why not try to fix the problem?
In June 2020, he founded HEN Technologies (for high-efficiency nozzles) in nearby Hayward. With National Science Foundation funding, he conducted computational fluid dynamics research, analyzing how water suppresses fire and how wind affects it. The result: a nozzle that controls droplet size precisely, manages velocity in new ways, and resists wind.
In HEN’s comparison video, which Sethi shows me over a Zoom call, the difference is stark. It’s the same flow rate, he says, but HEN’s pattern and velocity control keep the stream coherent while traditional nozzles disperse.
But the nozzle is just the beginning — what Sethi calls “the muscle on the ground.” HEN has since expanded into monitors, valves, overhead sprinklers, and pressure devices, and is launching a flow-control device (“Stream IQ”) and discharge control systems this year. According to Sethi, each device contains custom-designed circuit boards with sensors and computing power — 23 different designs that turn dumb hardware into smart, connected equipment, some powered by Nvidia Jetson Orin Nano processors. Altogether, says Sethi, HEN has filed 20 patent applications, with half a dozen granted so far.
The real innovation is the system these devices create. HEN’s platform uses sensors at the pump to act as a virtual sensor in the nozzle, tracking exactly when it’s on, how much water flows, and what pressure is required. The system captures precisely how much water was used for a given fire, how it was used, which hydrant was tapped, and what the weather conditions were.
Why it matters: Fire departments can run out of water otherwise, because there’s no communication between water suppliers and firefighters. It happened in the Palisades Fire. It happened in the Oakland Fire decades earlier. When two engines are connected to one hydrant, pressure variations can mean that one engine suddenly gets nothing as a fire continues to grow. In rural America, water tenders, which are tankers shuttling water from distant sources, face their own logistical nightmares. If they can integrate water usage calculations with their own utility monitoring systems to optimize resource allocation, that’s a giant win.
So HEN built a cloud platform with application layers, which Sethi likens to what Adobe did with cloud infrastructure. Think individual à la carte systems for fire captains, battalion chiefs, and incident commanders. HEN’s system has weather data; it has GPS in all devices. It can warn those on the front lines that the wind is about to shift and they’d better move their engines, or that a particular fire truck is running out of water.
The Department of Homeland Security has been asking for exactly this kind of system through its NERIS program, which is an initiative to bring predictive analytics to emergency operations. “But you can’t have [predictive analytics] unless you have good quality data,” Sethi notes. “You can’t have good quality data unless you have the right hardware.”
HEN isn’t monetizing that data yet. It’s implementing data nodes, putting devices in as many systems as possible, building the data pipeline, creating the data lake. Next year, says Sethi, it will start commercializing the application layer with its built-in intelligence.
If building a predictive analytics platform for emergency response sounds daunting, Sethi says actually selling it is tougher, and he’s proudest of HEN’s traction on that front.
“The hardest part of building this company is that this market is tough because it’s a B2C play when you think of convincing the customers to buy, but the procurement cycle is B2B,” he explains. “So you have to really make a product that resonates with people — with the end user — but you still have to go through government purchasing cycles, and we have cracked both of those.”
The numbers bear this out. HEN launched its first products into the market in the second quarter of 2023, lining up 10 fire departments and generating $200,000 in revenue. Then word started to spread. Revenue hit $1.6 million in 2024, then $5.2 million last year. This year, HEN, which currently has 1,500 fire department customers, is projecting $20 million in revenue.
HEN has competition, of course. IDEX Corp, a public company, sells hoses, nozzles, and monitors. Software companies like Central Square serve fire departments. A Miami company, First Due, which sells software to public safety agencies, announced a massive $355 million round last August. But no company is “doing exactly what we are trying to do,” insists Sethi.
Still, Sethi says that the constraint isn’t demand — it’s scaling fast enough. HEN serves the Marine Corps, US Army bases, Naval atomic labs, NASA, Abu Dhabi Civil Defense, and ships to 22 countries. It works through 120 distributors and recently qualified for GSA after a year-long vetting process (that’s a federal seal of approval that makes it easier for military and government agencies to buy).
Fire departments buy about 20,000 new engines each year to replace aging equipment in a national fleet of 200,000, so the idea is that once HEN is qualified, those purchases become recurring revenue, and because the hardware generates data, revenue continues between purchase cycles.
HEN’s dual goal has required building a very specific team. Its software lead was formerly a senior director who helped build Adobe’s cloud infrastructure. Other members of HEN’s 50-person team include a former NASA engineer and veterans from Tesla, Apple, and Microsoft. “If you ask me technical questions, I would not be able to answer everything,” Sethi admits with a laugh, “but I have such good teams that [it] has been a blessing.”
Indeed, it’s the software that hints at where this gets interesting, because while HEN is selling nozzles, it’s amassing something more valuable: data. Highly specific, real-world data about how water behaves under pressure, how flow rates interact with materials, how fire responds to suppression techniques, how physics works in active fire environments.
It’s exactly what companies building so-called world models need. These AI systems, which construct simulated representations of physical environments to predict future states, require real-world, multimodal data from physical systems under extreme conditions. You can’t teach AI about physics through simulations alone. You need what HEN collects with every deployment.
Sethi won’t elaborate, but he knows what he’s sitting on. Companies training robotics and predictive physics engines would pay handsomely for this kind of real-world physics data.
Investors see it, too. Last month, HEN closed a $20 million Series A round, plus $2 million in venture debt from Silicon Valley Bank. O’Neil Strategic Capital led the financing, with NSFO, Tanas Capital, and z21 Ventures participating. The round brought the company’s total funding to more than $30 million.
Sethi, meanwhile, is already looking ahead. He says the company will return to fundraising in the second quarter of this year.
Never underestimate the chilling powers of grainy grayscale imagery and ethereal whooshing sounds. Outside Parties asks, “What if I Spy, but in an alien hell dimension?”, and it is impressively unnerving despite the fact that nothing’s really happening at any given time. It goes all in on atmosphere, to great effect. This is the Playdate horror game that I’ve been waiting for.
Adams Immersive’s Outside Parties is a sort of scavenger hunt across a massive image of a realm called the Outside, which can only be visited by astral travel, according to the lore. There are lots of unknowns about what or where it really is, though explorers have mapped it fairly extensively through out-of-body excursions and they’ve encountered thousands of different entities there, including the spirits of the dead. As the player, you have come across a Hellscryer K5 — the communication device, psychic camera and recorder used for these trips — and now you’re combing through the mission logs, getting sucked into the mystery of it all. Think of the K5 as your Playdate, except powered by blood and runes.
At the center of Outside Parties is a 1.44-gigapixel, 360-degree panoramic HDR image with dozens of eerie scenes hidden within it: skeletons of human, animal and paranormal origin; scary robed figures and occult symbols etched all around; what appear to be fountains and rivers of blood; a Stonehenge of teeth. These are the targets you’re meant to track down, and as you home in and check them off your list, voice signals attached to each one will reveal more and more of the explorer’s spellbinding story.
But this isn’t a straightforward “find the object” puzzle game by any means. When you first look at the zoomed-out photo, it’s akin to a strip of TV static with some heavily shadowed areas throughout. You can zoom in at up to 64 times magnification to get a better look at specific zones, but you also have to adjust the image brightness using the crank to improve the clarity of the objects. Making it brighter or darker will reveal more objects in certain spots while simultaneously obscuring others. There are 150 targets, according to the developer, which should take players somewhere between 10 and 20 hours to complete. I’ve been at it for hours and still have plenty left to find. (If you’re stuck, you can turn to the helpful target lookup page, which provides hints with varying degrees of specificity.)
All the while as you’re hunched over your Playdate, laser-focused on the screen to find targets that are buried in a sea of fuzz, unsettling audio transmissions are cutting in and out, disturbing images are flashing on-screen at random and a constant atmospheric whooshing is playing in your ear. The sound design of this game is seriously brilliant — it’s worth playing for that alone, not to mention all the other cool stuff. From the startup page to the menus where you’ll find bits of a background story, to the creepy clips of people wailing and ominously reciting numbers, the sounds of Outside Parties make for a truly immersive, disconcerting experience that I previously wouldn’t have thought possible on a Playdate. It’s really something special.
Outside Parties also comes with a screensaver that once again makes me yearn for the Playdate Stereo Dock. Pop on the Void Monitor, sit back, and enjoy the horrifying sights and sounds of the Outside.
When you open YouTube Music’s mobile apps, the miniplayer now shows the last song you played on any device. “From your iPhone” or “From your browser” temporarily replaces the artist information until you start playback.
Your queues are simply kept in sync, with YouTube Music appearing to prioritize the last play session (on any device) to update/override existing queues.
This is handy if you have a tablet alongside your phone or are a heavy user of YouTube Music on the web. A setting to control sync might be nice if your listening habits are device-dependent, but this is otherwise a good quality-of-life improvement.
A snowstorm began dumping heaps of snow on Massachusetts on Sunday, and heavy snowfall is expected to continue into the night, according to the National Weather Service.
All of the state is predicted to get between a foot and 20 inches of snow as the snowstorm continues through 8 p.m. Monday, according to the weather service’s winter storm warning. Parts of Northeast and Western Massachusetts could see up to two feet.
Rapid snowfall rates of 1 to 3 inches per hour are expected to continue until midnight on Sunday before calming down by 2 a.m., according to the weather service. Travel conditions will remain extremely poor during this time, and visibility while driving could deteriorate to a quarter-mile during the heaviest snowfall.
Lingering moisture is predicted to cause lighter snowfall through much of the day on Monday, according to the weather service. The snow is expected to stop in Eastern and Central Massachusetts by 8 p.m. Monday and by 9 or 10 p.m. in Western Massachusetts.
Below zero wind chills are predicted to follow the end of the snowstorm across the state Monday night, according to the weather service. Wind chills around -1 or -2 degrees are expected in Eastern Massachusetts, while wind chills around -6 or -7 are expected in Central Massachusetts and parts of Western Massachusetts. Wind chills as frigid as -10 degrees are possible in the Berkshires.
Overnight low temperatures Sunday night are predicted to dip into the mid 20s in Eastern Massachusetts and the mid teens in Central Massachusetts and parts of Western Massachusetts, according to the weather service. Lows in the single digits are expected in the Berkshires.
Highs during the day on Monday are predicted to reach the low to mid 30s in Eastern Massachusetts and the low to mid 20s in Central and Western Massachusetts, according to the weather service. Overnight lows Monday night are predicted to drop into the single digits in most of the state.
If your Gmail account didn’t seem to be working properly Saturday, you were not alone. But Google says the issue has since been resolved.
The official status dashboard for Google Workspace suggests that problems began at around 5am Pacific on Saturday morning, with users experiencing both “misclassification of emails in their inbox and additional spam warnings.”
For me, that meant my Primary inbox was filled with messages that would normally appear in the Promotions, Social, or Updates inboxes, and that spam warnings were appearing in emails from known senders. Other users complained on social media that “all the spam is going directly to my inbox” and that Gmail’s filters seem “suddenly completely busted.”
The Google dashboard was updated with messages throughout the day on Saturday saying that the company was still working to resolve the issue. Later on Saturday evening, Google posted that the issue had been “fully resolved for all users.”
“Some Gmail users experienced a misclassification of emails in their inbox, delays in receiving email,” the company said in a dashboard update. “Additionally, misclassified spam warnings from the incident may persist for existing messages received before the issue resolution.”
The company also said that it will “publish an analysis of this incident once we have completed our internal investigation.”
This post was first published on January 24, 2026. It has been updated to include Google’s statement that the issue has been resolved.