The First ‘Avengers: Doomsday’ Teaser Is Finally Here

https://gizmodo.com/app/uploads/2025/12/avengers-doomsday-trailer-steve-rogers-1280×853.jpg

It’s been nearly a decade since the last MCU movie to wear the “Avengers” banner (and not hide it), and now the Earth’s Mightiest Heroes are returning for Avengers: Doomsday.

Helmed by returning directors Joe and Anthony Russo, the fifth Avengers film sees two different Avengers teams—one with the recently christened New Avengers, the other brought together by Anthony Mackie’s Captain America—caught up in a war for the multiverse that also loops in older versions of the Fox X-Men, our latest iteration of the Fantastic Four, and even the Wakandans and Namor. Oh, and Doctor Doom, played by former Iron Man Robert Downey, Jr.

https://x.com/MarvelStudios/status/2003465624325095737

But instead of all that, this tease centers on Chris Evans’ Steve Rogers. The worst-kept secret of the film is the return of the original Captain America, who now has a child, presumably with Peggy Carter (Hayley Atwell, also returning), after the two sealed their time-displaced romance with a kiss at the very end of Endgame. What that means for the film’s plot is a big mystery, but given the subtitle, things likely won’t stay rosy for the Rogers family.

“Steve Rogers will return” when Avengers: Doomsday hits theaters on December 18, 2026, followed by Avengers: Secret Wars on December 17, 2027. And before Doomsday comes out, Marvel sure hopes you liked Endgame enough to see it on the big screen again in September 2026.

Want more io9 news? Check out when to expect the latest Marvel, Star Wars, and Star Trek releases, what’s next for the DC Universe on film and TV, and everything you need to know about the future of Doctor Who.

Gizmodo

MySQL DBAs Are Landing Six-Figure Jobs in This Economy. And You Can Too!

https://webyog.com/wp-content/uploads/2025/12/database_schema.png

If you’re searching for a stable, well-paying career in the rapidly expanding world of data, consider becoming a MySQL Database Administrator. While tech layoffs make headlines, MySQL DBAs remain in constant demand as companies simply can’t afford to lose the experts who keep their critical data systems running 24/7.

In today’s data-driven business landscape, MySQL database administrators are among the most sought-after IT professionals, because MySQL is powering everything from small startups to tech giants like Facebook, Twitter, and YouTube. Skilled DBAs who can keep these critical systems running smoothly are essential to modern business operations.

Why MySQL DBAs Matter More Than Ever

In e-commerce, even minutes of downtime during peak shopping periods can cost millions, and for financial services, database performance directly impacts trading systems where milliseconds matter. Database administrators ensure that organizations’ most valuable asset—their data—remains available, secure, and performant. This responsibility becomes even more critical when you consider the real-world impact of database failures in healthcare. A database outage doesn’t just mean lost revenue; it could mean the difference between life and death when patient records become inaccessible during emergency care.

This critical importance is why MySQL DBAs command strong salaries and enjoy excellent job security. But there’s another reason MySQL skills are particularly valuable: MySQL is free to learn and practice, unlike proprietary systems like Oracle or SQL Server that can cost thousands just to set up a learning environment.

What MySQL DBAs Actually Do

MySQL database administrators wear many hats, combining technical expertise with business awareness. In a typical week, a MySQL DBA might:

  • Implement database changes during carefully planned maintenance windows
  • Refresh development databases with production data while protecting sensitive information
  • Diagnose and resolve performance bottlenecks affecting critical applications
  • Review and optimize queries that developers have submitted
  • Grant appropriate access permissions to new team members
  • Plan and test disaster recovery procedures

These diverse responsibilities require both deep technical knowledge and strong communication skills. DBAs work closely with developers, system administrators, and business stakeholders—making them central to IT operations.

The role appeals to professionals who enjoy variety, problem-solving, and having direct impact on business success. Unlike developers who might work on a single project for months, DBAs handle multiple challenges daily, each requiring quick thinking and careful execution.

Essential Skills Every MySQL DBA Needs

Building a foundation as a MySQL DBA starts with mastering core competencies. Here are the fundamental skills that separate professional DBAs from casual users:

Installation and Configuration: Know how to install MySQL, apply patches, configure settings for optimal performance, and set up automated monitoring. Understanding different MySQL variants (MariaDB, Amazon Aurora, etc.) expands your opportunities since these skills transfer directly.

Security Management: Master the GRANT and REVOKE statements to control data access. Understanding MySQL’s security model protects organizations from breaches that could cost millions and destroy reputations.
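As a quick illustration (the account name, schema, and connection details below are hypothetical), a DBA might script a least-privilege grant and its cleanup with Python and mysql-connector-python:

```python
import mysql.connector  # pip install mysql-connector-python

# Hypothetical admin connection to a local practice instance.
conn = mysql.connector.connect(host="127.0.0.1", user="root", password="secret")
cur = conn.cursor()

# Create a read-only reporting account and grant it SELECT on one schema only.
cur.execute("CREATE USER IF NOT EXISTS 'report_ro'@'%' IDENTIFIED BY 'ChangeMe!123'")
cur.execute("GRANT SELECT ON sales_db.* TO 'report_ro'@'%'")

# Revoke the access (and drop the account) once it is no longer needed.
cur.execute("REVOKE SELECT ON sales_db.* FROM 'report_ro'@'%'")
cur.execute("DROP USER 'report_ro'@'%'")

cur.close()
conn.close()
```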

Backup and Recovery: Learn to perform full, differential, and incremental backups—and more importantly, practice restoring them. The ability to recover data quickly during a crisis defines a DBA’s value.
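For example (the database name, paths, and credentials are placeholders), a full logical backup plus a restore rehearsal can be scripted around mysqldump:

```python
import subprocess
from datetime import date

db = "sales_db"  # hypothetical schema to protect
dump_file = f"/var/backups/mysql/{db}-{date.today()}.sql"

# Full logical backup; credentials are assumed to come from ~/.my.cnf.
with open(dump_file, "w") as out:
    subprocess.run(
        ["mysqldump", "--single-transaction", "--routines", "--events", db],
        stdout=out, check=True,
    )

# The restore rehearsal is what proves the backup is usable:
# load the dump into a scratch schema and spot-check the data afterwards.
subprocess.run(["mysql", "-e", f"CREATE DATABASE IF NOT EXISTS {db}_restore_test"], check=True)
with open(dump_file) as dump:
    subprocess.run(["mysql", f"{db}_restore_test"], stdin=dump, check=True)
```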

Performance Optimization: Understand how indexes work, when to use them, and how they impact query performance. What seems like minor tuning can mean the difference between queries that run in milliseconds versus minutes.
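As a sketch (the orders table and its columns are invented for illustration), comparing EXPLAIN output before and after adding an index shows the kind of tuning involved:

```python
import mysql.connector

conn = mysql.connector.connect(host="127.0.0.1", user="root",
                               password="secret", database="sales_db")
cur = conn.cursor()

query = "SELECT * FROM orders WHERE customer_id = %s AND status = 'open'"

# Before: the plan will typically show a full table scan (type = ALL).
cur.execute("EXPLAIN " + query, (42,))
print(cur.fetchall())

# Add a composite index that covers both filter columns.
cur.execute("CREATE INDEX idx_orders_customer_status ON orders (customer_id, status)")

# After: the optimizer should now pick the index (type = ref), often turning
# a long scan into a millisecond lookup on large tables.
cur.execute("EXPLAIN " + query, (42,))
print(cur.fetchall())

cur.close()
conn.close()
```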

SQL Proficiency: Write complex queries using JOINs, subqueries, and aggregate functions. While tools can help generate SQL, understanding the underlying language helps diagnose problems and optimize performance.
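For instance (the customers/orders schema here is hypothetical), a typical reporting query combines a JOIN, aggregates, and a HAVING filter:

```python
import mysql.connector

conn = mysql.connector.connect(host="127.0.0.1", user="root",
                               password="secret", database="sales_db")
cur = conn.cursor()

# Revenue per customer over the last 30 days, highest spenders first.
cur.execute("""
    SELECT c.name, COUNT(o.id) AS order_count, SUM(o.total) AS revenue
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.id
    WHERE o.created_at >= NOW() - INTERVAL 30 DAY
    GROUP BY c.id, c.name
    HAVING revenue > 1000
    ORDER BY revenue DESC
""")
for name, order_count, revenue in cur.fetchall():
    print(name, order_count, revenue)

cur.close()
conn.close()
```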

Monitoring and Maintenance: Use tools to track database health, identify bottlenecks, and prevent problems before users notice them.
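One built-in starting point (no extra tooling assumed) is the performance_schema statement digest table, which surfaces the most expensive query patterns:

```python
import mysql.connector

conn = mysql.connector.connect(host="127.0.0.1", user="root", password="secret")
cur = conn.cursor()

# Top five statement patterns by total execution time (timer values are in picoseconds).
cur.execute("""
    SELECT digest_text,
           count_star AS executions,
           ROUND(sum_timer_wait / 1e12, 2) AS total_seconds
    FROM performance_schema.events_statements_summary_by_digest
    ORDER BY sum_timer_wait DESC
    LIMIT 5
""")
for digest_text, executions, total_seconds in cur.fetchall():
    print(total_seconds, executions, (digest_text or "")[:80])

cur.close()
conn.close()
```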

Essential MySQL Tools

Professional MySQL DBAs rely on specialized tools to work efficiently. At Webyog, we’ve developed two essential tools that thousands of DBAs use daily:

SQLyog provides a powerful IDE for database administration and development. Instead of remembering complex command-line syntax, DBAs use SQLyog to visually manage schemas, write queries with intelligent auto-completion, synchronize databases, and handle routine tasks more efficiently.

SQL Diagnostic Manager for MySQL (formerly Monyog) delivers real-time monitoring and performance analytics. It helps DBAs identify slow queries, track resource usage, spot security vulnerabilities, and receive alerts before problems impact users. The ability to monitor multiple MySQL instances from a single dashboard becomes invaluable as environments grow.

Mastering these professional tools alongside core MySQL skills prepares you for real-world DBA responsibilities and makes you more attractive to employers.

Your Learning Path Forward

The good news is that MySQL’s open-source nature makes it accessible to anyone willing to learn. You can download MySQL for free, set up a practice environment on your personal computer, and start building skills immediately. Cloud providers also offer free tiers where you can practice with managed MySQL services.

Start with the right tools: Download SQLyog Community Edition from GitHub as a free MySQL IDE to begin your learning journey. For those ready to experience the full professional toolkit, try SQLyog Ultimate Edition with a 14-day free trial – the same IDE that thousands of professional DBAs rely on daily. Join the Webyog Forums where over 15,000 MySQL users share solutions and answer questions. Having access to professional-grade tools from day one accelerates your learning and provides immediate practical experience.

Many successful MySQL DBAs are self-taught, combining hands-on practice with community resources. The MySQL documentation provides comprehensive reference material, while forums and FAQs offer real-world solutions to common challenges. Video tutorials and blog posts from experienced DBAs share practical tips that textbooks often miss.

For those who prefer structured learning, numerous online courses and certifications can accelerate your progress. While certifications don’t guarantee competence, they demonstrate commitment to potential employers—especially valuable when seeking your first DBA position.

The key is to start somewhere and maintain consistency. Set up your learning environment, join the community, and begin with basic tasks. As you gain confidence, tackle increasingly complex challenges. The path from beginner to professional is clear when you have the right resources and community support.

Taking the Next Step

The demand for skilled MySQL DBAs continues growing as organizations generate more data and rely increasingly on data-driven decisions. Whether you’re already in IT looking to specialize or considering a career change, MySQL offers an accessible path to a rewarding career.

The journey from beginner to professional MySQL DBA requires dedication but follows a clear path. Start with fundamentals, practice in a safe environment, engage with the community, and gradually take on more complex challenges.

Ready to begin your journey? We’ve created a comprehensive guide that maps out the complete path to becoming a MySQL DBA. Download our free whitepaper “How to Become a Database Administrator for MySQL” for detailed learning paths, specific resources, and a practical action plan to launch your DBA career. 

Prefer a shorter overview first?

Read our blog on how to become a MySQL database administrator to understand the skills, responsibilities, and progression before diving deeper.


FAQ: Your Path to Becoming a MySQL Database Administrator

Career Basics

Q: What exactly does a MySQL DBA do day-to-day?
A: MySQL DBAs manage and maintain database systems, ensuring data remains secure, available, and performs optimally. Daily tasks include implementing database changes, diagnosing performance issues, managing user permissions, performing backups, and working with developers to optimize queries. The role combines technical database work with cross-team collaboration.

Q: Do I need a computer science degree to become a MySQL DBA?
A: No. While a CS degree can be helpful, many successful MySQL DBAs come from diverse backgrounds including system administration, development, or even non-technical fields. What matters most is demonstrating practical skills, which you can learn through self-study, online courses, and hands-on practice with MySQL’s free Community Edition.

Q: How long does it take to become job-ready as a MySQL DBA?
A: With dedicated study and practice, you can build foundational DBA skills in 6-12 months. Most professionals recommend 200+ hours of hands-on learning before interviewing. The timeline varies based on your current IT experience and available study time, but consistency matters more than speed.

Skills & Learning

Q: What’s the most important skill for a MySQL DBA?
A: Backup and recovery is arguably one of the most critical skills. Your value as a DBA is directly tied to your ability to protect and restore data during crises. Beyond that, understanding performance optimization through indexes and query tuning separates good DBAs from great ones. SQL proficiency and security management round out the core skills.

Q: Should I learn MySQL specifically or start with SQL in general?
A: Start with MySQL directly since it’s free to download and practice with. You’ll learn standard SQL as part of working with MySQL, but you’ll also gain MySQL-specific knowledge about storage engines, replication, and performance tuning that makes you immediately employable.

Q: Are MySQL skills transferable to other databases?
A: Yes. MySQL skills transfer well to MariaDB, Amazon Aurora, and other variants with minimal adjustment. Core concepts like SQL, indexing, backups, and performance tuning apply across most relational databases, though specific syntax and tools vary.

Tools & Resources

Q: What tools do I need to start learning MySQL administration?
A: Download MySQL Community Edition (free) and SQLyog Community Edition (free MySQL IDE from GitHub). These provide everything needed for learning. As you advance, consider SQLyog Ultimate to experience professional features, and explore SQL Diagnostic Manager for the critical role of monitoring.

Q: Where can I get help when I’m stuck?
A: The Webyog Forums have over 15,000 members sharing MySQL knowledge. The official MySQL documentation at dev.mysql.com is comprehensive. Stack Overflow’s MySQL tag provides answers to common problems. Starting with SQLyog’s Community Edition also gives you access to community support.

Q: Do I need expensive training courses?
A: No. MySQL’s open-source nature means abundant free resources exist. Start with free tutorials, documentation, and YouTube videos. Consider paid courses only after you’ve exhausted free options and want structured learning or specific certifications.

Career Prospects

Q: Are MySQL DBA jobs really paying six figures?
A: Yes, experienced MySQL DBAs commonly earn $100,000-$150,000+ in the US, with senior positions and high cost-of-living areas pushing higher. Entry-level positions typically start around $70,000-$80,000. Specialized skills in cloud platforms, automation, or large-scale systems command premium salaries.

Q: Is MySQL DBA a good career choice with all the tech layoffs?
A: MySQL DBAs have remained in constant demand even during tech downturns. Companies can’t afford to lose the experts who maintain their critical data infrastructure. Unlike some tech roles that can be outsourced or automated, DBAs need deep understanding of specific systems and business requirements.

Q: What’s the typical career progression for a MySQL DBA?
A: Common paths include: Junior DBA → DBA → Senior DBA → Lead DBA or Database Architect. Many DBAs also move into specialized roles like Performance Tuning Expert, Cloud Database Specialist, or transition to Data Engineering. Some become consultants or move into management as Database Managers or IT Directors.

SQLyog is part of the Idera family of tools.

 Idera delivers trusted, enterprise-grade tools that accelerate innovation across data, development, DevOps, and testing.

For Snowflake users, we also recommend exploring these companion solutions:

  • Yellowfin – A leader in embedded analytics and modern BI, with self-service dashboards that drive insight and engagement.
  • DataSync – Seamless integration of ServiceNow data into Snowflake (or other platforms), without the cost and performance overhead of APIs.
  • IDERA SQL Tools – Monitoring, auditing, and performance solutions for SQL Server environments—enterprise-proven and production-ready.
  • ER/Studio – Enterprise data modeling and metadata management that standardizes definitions, improves governance, and streamlines collaboration across complex data environments.

Planet for the MySQL Community

Watch ILM Recreate the Death Star Trench Run Out of Virtual Gingerbread

https://gizmodo.com/app/uploads/2025/12/star-wars-minis-ilm-death-star-trench-run-1280×853.jpg

Just in time for the holidays, Star Wars is celebrating in style with a cutesy recreation of the iconic Death Star trench run from A New Hope rendered as if it was painstakingly made out of gingerbread. But a simple festive sweet treat, this ain’t: it’s the first in a volley of shorts for a new animated miniseries, Star Wars Minis.

Lucasfilm has released the first of the shorts, a brief side-by-side comparison of the trench run sequences from the original Star Wars with the gingerbread recreation. It’s very cute, from the gingerbread cameos of Luke, Han, and Vader, to the gumdrop proton torpedoes fired to destroy the battle station (which blows up with a suitably adorable cookie aftershock ring).

But the short is really just a herald for a new series of similarly ideated shorts called Star Wars Minis, which will be less festively inclined. An accompanying behind-the-scenes video from ILM frames the new shorts as a series of ways to explore beloved moments from across Star Wars film and TV in new styles and materials, utilizing new technologies developed by ILM.

Have no fear about “new technologies” just yet, in the wake of Disney’s attempts to embrace generative AI before the tech bubble bursts: Star Wars Minis looks to be modelling things actually crafted by ILM first, from printed, chibi-fied models of C-3PO and R2-D2 to hand-knitted crochet dolls of Yoda, Grogu, and more. The latter style definitely seems to be the focus of this teaser, with knitted riffs on multiple scenes from Phantom Menace, A New Hope, Empire Strikes Back, and Return of the Jedi, as well as The Mandalorian rendered in digital fuzzy felt.

It’s a fun way to create short little Star Wars riffs, especially with clever technological solutions to deliver them at a similarly small scale.

We’ll see more from Star Wars Minis in 2026.

Want more io9 news? Check out when to expect the latest Marvel, Star Wars, and Star Trek releases, what’s next for the DC Universe on film and TV, and everything you need to know about the future of Doctor Who.

Gizmodo

How To Measure The Impact Of Features

http://files.smashing.media/articles/how-measure-impact-features-tars/how-measure-impact-features-tars.jpg

So we design and ship a shiny new feature. How do we know if it’s working? How do we measure and track its impact? There is no shortage of UX metrics, but what if we wanted to establish a simple, repeatable, meaningful UX metric — specifically for our features? Well, let’s see how to do just that.

I first heard about the TARS framework from Adrian H. Raudaschl’s wonderful article on “How To Measure Impact of Features”. In it, Adrian highlights how his team tracks and decides which features to focus on — and then maps them against each other in a 2×2 quadrant matrix.

It turned out to be a very useful framework to visualize the impact of UX work through the lens of business metrics.

Let’s see how it works.

1. Target Audience (%)

We start by quantifying the target audience by exploring what percentage of a product’s users have the specific problem that a feature aims to solve. We can study existing features that try to solve similar problems and see how many users engage with them.

Target audience isn’t the same as feature usage though. As Adrian noted, if we know that an existing Export Button feature is used by 5% of all users, it doesn’t mean that the target audience is 5%. More users might have the problem that the export feature is trying to solve, but they can’t find it.

Question we ask: “What percentage of all our product’s users have that specific problem that a new feature aims to solve?”

2. Adoption (%)

Next, we measure how well we are “acquiring” our target audience. For that, we track how many users actually engage successfully with that feature over a specific period of time.

We don’t focus on CTRs or session duration here, but rather on whether users meaningfully engage with it: for example, signals that they found it valuable, such as sharing the export URL, the number of exported files, or the use of filters and settings.

High feature adoption (>60%) suggests that the problem was impactful. Low adoption (<20%) might imply that the problem has simple workarounds that people have relied upon. Changing habits takes time, too, and so low adoption in the beginning is expected.

Sometimes, low feature adoption has nothing to do with the feature itself, but rather where it sits in the UI. Users might never discover it if it’s hidden or if it has a confusing label. It must be obvious enough for people to stumble upon it.

Low adoption doesn’t always equal failure. If a problem only affects 10% of users, hitting 50–75% adoption within that specific niche means the feature is a success.

Question we ask: “What percentage of active target users actually use the feature to solve that problem?”

3. Retention (%)

Next, we study whether a feature is actually used repeatedly. We measure the frequency of use, or specifically, how many users who engaged with the feature actually keep using it over time. Typically, it’s a strong signal for meaningful impact.

If a feature has a >50% retention rate (on average), we can be quite confident that it has high strategic importance. A 25–35% retention rate signals medium strategic significance, and 10–20% retention indicates low strategic importance.

Question we ask: “Of all the users who meaningfully adopted a feature, how many came back to use it again?”

4. Satisfaction Score (CES)

Finally, we measure the level of satisfaction that users have with the feature we’ve shipped. We don’t ask everyone — we ask only “retained” users. It helps us spot hidden troubles that might not be reflected in the retention score.

Once users have used a feature multiple times, we ask them how easy it was to solve their problem with it, on a scale from “much more difficult” to “much easier than expected”. That gives us a Customer Effort Score (CES) for the feature.

Using TARS For Feature Strategy

Once we start measuring with TARS, we can calculate an S÷T score — the percentage of Satisfied Users ÷ Target Users. It gives us a sense of how well a feature is performing for our intended target audience. Once we do that for every feature, we can map all features across 4 quadrants in a 2×2 matrix.
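To make the arithmetic concrete, here is a tiny worked example in Python; the feature and all of the numbers are invented purely for illustration:

```python
# Hypothetical funnel for one feature ("CSV export") of a product with 40,000 active users.
total_users     = 40_000
target_users    =  8_000   # users who actually have the problem the feature solves
adopted_users   =  4_400   # meaningfully engaged with the feature at least once
retained_users  =  2_200   # came back and used it again
satisfied_users =  1_800   # retained users reporting the task got easier (CES)

T = target_users    / total_users      # 0.20  -> 20% target audience
A = adopted_users   / target_users     # 0.55  -> 55% adoption within the target group
R = retained_users  / adopted_users    # 0.50  -> 50% retention among adopters
S = satisfied_users / retained_users   # ~0.82 -> 82% satisfaction among retained users

# The S÷T score: satisfied users as a share of the intended target audience.
s_over_t = satisfied_users / target_users   # 0.225 -> 22.5%

print(f"T={T:.0%}  A={A:.0%}  R={R:.0%}  S={S:.0%}  S/T={s_over_t:.1%}")
```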

Overperforming features are worth paying attention to: they have low retention but high satisfaction. They might simply be features that users don’t need to use frequently, but when they do, they’re extremely effective.

Liability features have high retention but low satisfaction, so they likely need improvement. And then we can also identify core features and project features — and have a conversation with designers, PMs, and engineers on what we should work on next.

Conversion Rate Is Not a UX Metric

TARS doesn’t cover conversion rate, and for a good reason. As Fabian Lenz noted, conversion is often considered the ultimate indicator of success — yet in practice it’s very difficult to draw a clear connection between smaller design initiatives and big conversion goals.

The truth is that almost everybody on the team is working towards better conversion. An uptick might be connected to many different initiatives — from sales and marketing to web performance improvements, seasonal effects, and UX initiatives.

UX can, of course, improve conversion, but it’s not really a UX metric. Often, people simply can’t choose the product they are using. And often a desired business outcome comes out of necessity and struggle, rather than trust and appreciation.

High Conversion Despite Bad UX

As Fabian writes, high conversion rate can happen despite poor UX, because:

  • Strong brand power pulls people in,
  • Aggressive urgency tactics prove effective,
  • Prices are extremely attractive,
  • Marketing performs brilliantly,
  • Customers are historically loyal,
  • Users simply have no alternative.

Low Conversion Despite Great UX

At the same time, a low conversion rate can occur despite great UX, because:

  • Offers aren’t relevant to the audience,
  • Users don’t trust the brand,
  • Poor business model or high risk of failure,
  • Marketing doesn’t reach the right audience,
  • External factors (price, timing, competition).

An improved conversion is the positive outcome of UX initiatives. But good UX work typically improves task completion, reduces time on task, minimizes errors, and avoids decision paralysis. And there are plenty of actionable design metrics we could use to track UX and drive sustainable success.

Wrapping Up

Product metrics alone don’t always provide an accurate view of how well a product performs. Sales might perform well, but users might be extremely inefficient and frustrated. Yet the churn is low because users can’t choose the tool they are using.

We need UX metrics to understand and improve user experience. What I love most about TARS is that it’s a neat way to connect customers’ usage and customers’ experience with relevant product metrics. Personally, I would extend TARS with UX-focused metrics and KPIs as well — depending on the needs of the project.

Huge thanks to Adrian H. Raudaschl for putting it together. And if you are interested in metrics, I highly recommend you follow him for practical and useful guides all around just that!

Meet “How To Measure UX And Design Impact”

You can find more details on UX Strategy in 🪴 Measure UX & Design Impact (8h), a practical guide for designers and UX leads to measure and show your UX impact on business. Use the code 🎟 IMPACT to save 20% off today. Jump to the details.



Video + UX Training

$495.00 (originally $799.00)

Get Video + UX Training

25 video lessons (8h) + Live UX Training.
100-day money-back guarantee.

Video only

$250.00 (originally $395.00)


Get the video course

25 video lessons (8h). Updated yearly.
Also available as a UX Bundle with 3 video courses.

Smashing Magazine

Cut MySQL RDS Audit Log Costs by 95% with AWS S3

Detailed MySQL RDS audit logs are non-negotiable for security and compliance standards like PCI-DSS and HIPAA. However, a bloated cloud bill for storing these logs shouldn’t be your default reality.

This blog shows you how to strategically leverage AWS services to maintain full compliance while achieving massive cost savings with the Mydbops RDS LogShift tool. We’ll walk through a real client case where we reduced their annual audit log costs from over $30,000 to under $2,000. The client stayed on Amazon RDS for MySQL as the managed database platform, with no compromise in security or observability.

The $30,000 Story: How We Cut Our Client’s Audit Log Costs by 95%

One of our clients needed to retain MySQL audit logs for five years to meet compliance standards. They had enabled log streaming to Amazon CloudWatch Logs, which seemed like the straightforward solution. However, after seeing their AWS bill climb month after month, they reached out to us for a cost optimization review.

The problem was stark: they were generating 1 TB of audit data monthly, and nobody had looked closely at the retention settings after the initial setup.

Like many AWS users, they had left the CloudWatch Log Group’s default retention policy set to "Never Expire." This meant they were paying premium CloudWatch storage rates indefinitely.

Their Painful Cost Breakdown

CloudWatch Audit Log Cost Breakdown (1 TB of MySQL RDS audit logs per month)

Cost Component                                   Monthly Calculation     Annual Cost
CloudWatch Ingestion Fee                         1,024 GB × $0.50/GB     $6,144.00
CloudWatch Storage Fee                           1,024 GB × $0.03/GB     $368.64
Total Annual Cost (recurring baseline)                                   $6,512.64
Projected Cost (5 Years, compounding storage)                            $32,563.20

Based on 1 TB/month of MySQL RDS audit logs streamed to Amazon CloudWatch Logs with default retention.

If you already stream MySQL RDS logs into CloudWatch, this pattern may look familiar. For a deeper dive into how RDS features impact ongoing cloud cost, you can refer to the Mydbops article on Point-In-Time Recovery in MySQL RDS, which also discusses retention trade-offs and storage impact.

We recommended a different approach: keep only the minimum data required for immediate operational scans in CloudWatch and move everything else to cold storage. Here’s how we cut their RDS audit log costs by 95%.

Step 1: Optimize CloudWatch Retention to the Minimum

The first immediate relief came from capping the high-cost storage by managing the CloudWatch retention policy intelligently. The principle is simple: only keep the data you need for active, real-time operational scanning in CloudWatch Logs Insights. Everything else should be pruned.

We navigated to the Log Group in the AWS Console and changed the retention policy to 30 days. This ensured logs were automatically deleted after they passed their high-utility operational phase.
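The same change can be scripted. Here is a minimal boto3 sketch, assuming a hypothetical region and log group name for the instance’s audit stream:

```python
import boto3

logs = boto3.client("logs", region_name="us-east-1")  # region is an assumption

# Hypothetical RDS audit log group; adjust to your instance identifier.
log_group = "/aws/rds/instance/prod-mysql-01/audit"

# Cap retention at 30 days so CloudWatch stops accumulating cold data.
logs.put_retention_policy(logGroupName=log_group, retentionInDays=30)

# Verify the setting took effect.
resp = logs.describe_log_groups(logGroupNamePrefix=log_group)
for group in resp["logGroups"]:
    print(group["logGroupName"], group.get("retentionInDays"))
```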

The Cost Impact of 30-Day Retention

This single change delivered two immediate benefits:

  • Eliminated the perpetual storage cost for any data older than 30 days
  • Minimized the volume of data scanned by Log Insights queries, reducing query costs

Step 2: The S3 Advantage for Long-Term Archival

With the operational window contained to 30 days, the next challenge was capturing and storing the long-term compliance data (5 years) cost-effectively.

The optimal solution is Amazon S3 with lifecycle policies. S3 allows data to move seamlessly through storage tiers, eventually landing in S3 Glacier Deep Archive where storage costs drop to approximately $0.00099 per GB—a 97% reduction compared to CloudWatch storage.
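A lifecycle rule implementing this tiering might look like the following boto3 sketch; the bucket name, prefix, and transition days are placeholders for illustration:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical archive bucket and prefix for the exported audit logs.
bucket = "acme-mysql-audit-archive"

s3.put_bucket_lifecycle_configuration(
    Bucket=bucket,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "audit-logs-to-deep-archive",
                "Filter": {"Prefix": "rds-audit-logs/"},
                "Status": "Enabled",
                # Move objects to Glacier Deep Archive once the hot window has passed...
                "Transitions": [{"Days": 30, "StorageClass": "DEEP_ARCHIVE"}],
                # ...and expire them after the 5-year compliance retention period.
                "Expiration": {"Days": 1825},
            }
        ]
    },
)
```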

The math is compelling, but the real challenge was implementation: how do we get logs from RDS to S3 without continuing to pay those crushing CloudWatch ingestion fees?

In practice, this means the client could store the same 60 TB of cumulative audit logs over five years at a tiny fraction of what CloudWatch would have charged. If you want to see how Mydbops thinks about backups, long-term durability, and recovery windows on RDS, the blog on migrating MySQL data to RDS/Aurora using XtraBackup and the post on MySQL RDS Point-In-Time Recovery show how S3 is used across backup and restore workflows.

Step 3: Cutting Costs with Mydbops RDS LogShift

The final game-changing step ensured that future log volumes bypass the costly CloudWatch ingestion pipeline altogether and flow directly to S3 for archival. This is where the Mydbops RDS LogShift tool delivered the essential optimization.

By deploying RDS LogShift, we achieved immediate and sustained cost reduction that will compound over the entire 5-year retention period.

How RDS LogShift Achieved a 95% Saving

The core of our optimization lies in how Mydbops RDS LogShift strategically manages log flow, directly addressing the biggest cost drivers:

Bypassing Ingestion Fees (The Critical Save): This is the game-changer. RDS LogShift can either directly retrieve rotated audit logs from the RDS instance itself or pull existing logs within their short retention period in CloudWatch Logs. By doing this, the tool ensures your long-term archival data circumvents the exorbitant $0.50/GB CloudWatch ingestion fee entirely. This process becomes a simple data transfer, turning a major cost center into a minor operational expense.

Compression and Partitioning: The tool efficiently compresses logs (reducing storage volume) and pushes them to S3 with date-based partitioning. This makes it easy to download and query specific logs when needed for compliance audits or security investigations.
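RDS LogShift itself is a Mydbops tool, so its internals aren’t shown here, but the general pattern it automates (pull rotated logs from the instance, compress them, and land them in S3 with date-based partitioning) can be sketched with plain boto3. The instance identifier, bucket, and file-name filter below are hypothetical:

```python
import gzip
from datetime import date

import boto3

rds = boto3.client("rds")
s3 = boto3.client("s3")

instance = "prod-mysql-01"             # hypothetical RDS instance identifier
bucket = "acme-mysql-audit-archive"    # hypothetical archive bucket

# Find rotated audit log files on the instance (names depend on the audit plugin).
files = rds.describe_db_log_files(
    DBInstanceIdentifier=instance, FilenameContains="audit"
)["DescribeDBLogFiles"]

for f in files:
    name = f["LogFileName"]

    # Pull the log contents in chunks; AdditionalDataPending tells us when to stop.
    marker, chunks = "0", []
    while True:
        part = rds.download_db_log_file_portion(
            DBInstanceIdentifier=instance, LogFileName=name, Marker=marker
        )
        chunks.append(part.get("LogFileData", ""))
        if not part.get("AdditionalDataPending"):
            break
        marker = part["Marker"]

    # Compress and upload with date-based partitioning for cheap, queryable archives.
    key = f"rds-audit-logs/dt={date.today()}/{name.replace('/', '_')}.gz"
    s3.put_object(Bucket=bucket, Key=key, Body=gzip.compress("".join(chunks).encode()))
```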

The Long-Term Results: Over $30,000 Saved

The cumulative savings achieved for our client over the 5-year retention period are substantial:

Cost overview

CloudWatch vs. optimized storage

Same audit log volume, two retention windows.

Period     Cumulative log volume     CloudWatch cumulative cost     Optimized S3 cumulative cost     Total savings
1 Year     12 TB                     $6,512                         $350                             $6,162
5 Years    60 TB                     $32,563                        $1,700                           $30,863 (≈95% saved)

By implementing the Mydbops RDS LogShift solution, our client gained full compliance while cutting their log costs by 94.7%. They maintained the same security posture and audit capabilities—just at a fraction of the cost.

Turn Your Audit Log Liability into a Cost-Saving Success Story

If you’re storing MySQL RDS audit logs in CloudWatch without a retention strategy, you’re likely overpaying by thousands of dollars annually. The solution doesn’t require compromising on compliance or security—it just requires smarter architecture.

Ready to see your AWS bill drop while maintaining full compliance? Contact Mydbops today to implement the RDS LogShift solution and start saving immediately.

Planet for the MySQL Community

The Meta Quest 3S is back down to its Cyber Monday all-time low of $250

The Meta Quest 3S is back on sale at its all-time low price of $250. That’s $50 off, or a discount of 17 percent, and matches a deal we saw on Cyber Monday. You can get the deal at Amazon and Best Buy, and the latter offers a $50 gift card with purchase.

The 3S is the more affordable model in the company’s current VR headset lineup. It features the same Snapdragon XR2 processor as the more expensive Meta Quest 3, but with lower resolution per eye and a slightly narrower field of view.

In our hands-on review, we gave the Meta Quest 3S a score of 90, noting how impressive the tech was compared to its price. The headset was comfortable to wear during longer gaming periods, and performance was quick and responsive, thanks largely to the upgraded processor and increased RAM over the Quest 2.

We were big fans of the new controllers, which the 3S shares with the more expensive Quest 3. This new generation of controller sports a more refined design, shedding the motion tracking ring and leaving behind a sleek form factor that fits in your hand like a glove.

We did miss the headphone jack, though most users are probably fine with the built-in speakers. You can wirelessly connect headphones for higher quality sound if you feel the need. The Quest 3S also recycles the old Fresnel lenses from the Quest 2, which can lead to some artifacts.

If you were considering a VR headset for yourself or a loved one this holiday season, the Meta Quest 3S offers an excellent value alongside impressive performance.

Follow @EngadgetDeals on X for the latest tech deals and buying advice.

This article originally appeared on Engadget at https://www.engadget.com/deals/the-meta-quest-3s-is-back-down-to-its-cyber-monday-all-time-low-of-250-144027382.html?src=rss

Engadget