5 Free Online Games and Websites to Master Linux and the Command Line


Learning Linux is essential for anyone working in the IT field. Linux distros are helpful to developers, system administrators, and cloud and network engineers.

Linux is popular because of its reliability and wide range of practical applications. If you want to know more about Linux, here are five websites that will help you learn it interactively. These sites have free games and exercises based on the Linux architecture and commands.

Linux Survival makes it easy to learn and master essential Linux commands, walking you through the fundamentals step by step. In module one, you will learn about the Linux directory structure. You will also learn to create directories and delete files using the command line.

You can practice listing file contents, renaming directories, and locating documents. In advanced modules, you will learn to obtain user information and manage security.

At the end of each module, there’s a practice quiz to test your knowledge. With Linux Survival, you can play around with familiar data, such as animals in a zoo.

You learn how to manipulate the data using commands on the screen. You then type the commands on the interactive shell and see the results.
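The commands these modules cover are standard shell commands, so you can also rehearse them in any Linux terminal. For example (the directory and file names here are just for practice):

```shell
# Make a practice directory, create a file, rename it, list it, then clean up
mkdir -p practice
echo "lion" > practice/animals.txt
mv practice/animals.txt practice/mammals.txt
ls practice                  # prints: mammals.txt
cat practice/mammals.txt     # prints: lion
rm -r practice
```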

The interface is simple and easy for beginners as they get instructions and an interactive shell to practice. The best part is you don’t have to sign up to use the workspace. You can start learning as soon as you land on the website. But it’s recommended that you create an account to track your progress.

Terminus is a command-line game created at MIT (the Massachusetts Institute of Technology). It provides an interactive command-line interface for practicing Linux, along with a set of commands and instructions on how to use them.

The interface is excellent for beginners who want to learn how to interact with the command line. The game stores its data in files and directories called locations, which you explore using the commands. For example, you must retrieve specific data to complete a challenge. You can also print information and change directories.

As you navigate through the directories, a picture on the terminal shows you where you are. This immerses your imagination in the game, making it fun and adventurous.

You can play Terminus without having to sign up on the website. Go ahead and explore this fun game.

Command Line Murder Mystery is a thrilling way to learn the Linux command line. With this game, you can be a police detective for a day. The game includes a fictitious police department looking to solve a murder plot. You must help them solve the murder by looking for hints and clues about the perpetrator.

In the game, you use Linux commands to navigate through folders and files, searching for clues. First, go to the project’s GitHub repository and download or clone the folder to your device.

When you open the folder labeled clmystery, you will see the files to work with. You can begin with the instructions file that guides you on how to play. They have cheat sheet files showing you Linux commands and how to use them.

If you get stuck, you can look for clues in the hint file. There’s also a solution file if you want to check whether your answer is correct. CLI Murder Mystery teaches a lot about controlling the terminal and managing its processes.
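The kind of searching the game asks for relies on standard tools such as grep. Here is an illustrative warm-up with a mock clue file (not the game's actual data):

```shell
# Create a mock clue file (a stand-in for the game's data) and search it
printf 'CLUE: the suspect drives a blue car\nNOTE: interview the barista\nCLUE: the suspect is left-handed\n' > mystery.txt
grep 'CLUE' mystery.txt      # print only the clue lines
grep -c 'CLUE' mystery.txt   # count the clues: prints 2
rm mystery.txt
```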

Bandit is one of the Wargames offered by the OverTheWire community. Bandit is for absolute beginners, as it helps you learn Linux by playing around with the interface.

You will learn several Linux commands while trying to solve various challenges. It helps you practice security concepts while playing fun games on the command line. As a beginner, you should start with the basics and advance to level 34.

Bandit helps you get familiar with the command line as you run the game on your device. It’s a great introduction to working with the terminal and Linux code editors and IDEs. To play, you must go to the website and obtain instructions on connecting using SSH (Secure Shell).
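As documented on the Bandit site at the time of writing, the Level 0 connection is made like this (the username and the password for the first level are both bandit0):

```shell
ssh bandit0@bandit.labs.overthewire.org -p 2220
```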

The game has different levels. You start at Level 0 and pass it by obtaining a password to access the next level. Each level provides instructions on what to do to finish the level. Without the passwords, you cannot access the next level of the game.

Each level has a page on the website listing commands that can help you beat it, along with a detailed explanation of each command and how to use it.

Playing Bandit will ensure you have a good understanding of Linux commands and how to apply them. If you get stuck, you can reach out to their community; they are always eager to help.

With Linux Journey, you will learn everything you need to know about Linux. The site is full of resources for both beginners and advanced learners. The exercises familiarize you with terms, jargon, and phrases used in Linux distributions as well.

You start learning about the origin of Linux and its distributions. Then you explore the command line, user management processes, and Linux security.

The interface has sections with notes and instructions on how to run commands. There’s also a separate interactive shell where you can practice Linux commands. At the end of each lesson, you have a quiz to test your knowledge.

The site is free to use, and there’s no need for sign-ups. All you have to do is navigate to the site and start learning.

Why Learn Linux Using Online Games and Websites?

Linux is one of the most popular technologies in use today thanks to its versatility, and learning it opens up numerous career opportunities in the IT field.

It introduces you to opportunities that will help you progress in your IT career. With Linux, you can contribute to open-source projects and collaborate with others. Learning Linux also introduces you to a community of Linux supporters worldwide.

MakeUseOf

Scientists in Japan Develop Experimental Alzheimer’s Vaccine That Shows Promise in Mice


Scientists in Japan may be at the start of a truly monumental accomplishment: a vaccine that can slow or delay the progression of Alzheimer’s disease. In preliminary research released this week, the vaccine appeared to reduce inflammation and other important biomarkers in the brains of mice with Alzheimer’s-like illness, while also improving their awareness. More research will be needed before this vaccine can be tested in humans, however.


The experimental vaccine is being developed primarily by scientists from Juntendo University in Japan.

It’s intended to work by training the immune system to go after certain senescent cells, aging cells that no longer divide to make more of themselves, but instead stick around in the body. These cells aren’t necessarily harmful, and some play a vital role in healing and other life functions. But they’ve also been linked to a variety of age-related diseases, including Alzheimer’s. The vaccine specifically targets senescent cells that produce high levels of something called senescence-associated glycoprotein, or SAGP. Other research has suggested that people with Alzheimer’s tend to have brains filled with these cells in particular.

The team tested their vaccine on mice bred to have brains that develop the same sort of gradual destruction seen in humans with Alzheimer’s. This damage is thought to be fueled by the accumulation of a misfolded form of amyloid-beta, a protein. The mice were divided into two groups, with only one group given the actual vaccine.

In the brains of the vaccinated mice, the team found signs of reduced inflammation and fewer amyloid deposits along with lower levels of SAGP-expressing cells. These mice also seemed to behave more like typical mice compared to controls. They continued to exhibit anxiety as they aged, for instance—a trait that tends to fade in people with late-stage Alzheimer’s. They also showed more awareness of their surroundings during maze tests.

The findings were presented over the weekend at the American Heart Association’s Basic Cardiovascular Sciences Scientific Sessions 2023. That means this research hasn’t been formally peer-reviewed yet, so it should be viewed with added caution. At the same time, the team’s vaccine appears to have met an important criterion that many past attempts have failed to reach.

“Earlier studies using different vaccines to treat Alzheimer’s disease in mouse models have been successful in reducing amyloid plaque deposits and inflammatory factors, however, what makes our study different is that our SAGP vaccine also altered the behavior of these mice for the better,” said lead author Chieh-Lun Hsiao, a post-doctoral fellow in the department of cardiovascular biology and medicine at Juntendo University, in a statement released by the American Heart Association.

Of course, mouse studies are only the beginning of showing that an experimental drug or vaccine can possibly work as intended. It will take further studies to validate these results and to test the vaccine’s safety in humans before large-scale trials even enter the picture.

But there have been several recent, if modest, successes in Alzheimer’s treatment, and other experimental candidates—including vaccines—are already in clinical trials. With any luck, these newer and upcoming therapies might one day stop Alzheimer’s from being the incurable death sentence it currently is.

Gizmodo

Migrating From On-Prem to RDS MySQL/Aurora? DEFINERS Is the Answer


Hello friends! If you plan to migrate your database from on-prem servers to RDS (either Aurora or RDS MySQL), you usually don’t have much choice but to do so using logical backups such as mysqldump, mysqlpump, mydumper, or similar. (Actually, you could take a physical backup with Percona XtraBackup to S3, but since the source’s brand (MySQL, Percona Server for MySQL, or MariaDB) and version (5.5, 5.6, or MariaDB 10.X) have not been specified, and many of those combinations are unsupported for that strategy, logical backup is the way to go.)

Depending on the size of the instance or the schema to be migrated, we can choose one tool or another to take advantage of the resources of the servers involved and save time.

In this blog, for the sake of simplicity, we are going to use mysqldump, and generate a single table, but the most curious thing is that we are going to create objects which have a certain DEFINER, and it must not be changed.

If you want to create the same lab, you can find it here.

Below is the list of objects to migrate (the schema is called “migration”):

mysql Source> SELECT *
FROM   (SELECT event_schema AS SCHEMA_NAME,
               event_name   AS OBJECT_NAME,
               definer,
               'EVENT'      AS OBJECT_TYPE
        FROM   information_schema.events
        UNION ALL
        SELECT routine_schema AS SCHEMA_NAME,
               routine_name   AS OBJECT_NAME,
               definer,
               'ROUTINE'      AS OBJECT_TYPE
        FROM   information_schema.routines
        UNION ALL
        SELECT trigger_schema AS SCHEMA_NAME,
               trigger_name   AS OBJECT_NAME,
               definer,
               'TRIGGER'      AS OBJECT_TYPE
        FROM   information_schema.triggers
        UNION ALL
        SELECT table_schema AS SCHEMA_NAME,
               table_name   AS OBJECT_NAME,
               definer,
               'VIEW'       AS OBJECT_TYPE
        FROM   information_schema.views
        UNION ALL
        SELECT table_schema AS SCHEMA_NAME,
               table_name   AS OBJECT_NAME,
               '',
               'TABLE'       AS OBJECT_TYPE
        FROM   information_schema.tables
        Where engine <> 'NULL'
) OBJECTS
WHERE  OBJECTS.SCHEMA_NAME = 'migration'
ORDER  BY 3,
          4;

+-------------+-----------------------+---------+-------------+
| SCHEMA_NAME | OBJECT_NAME           | DEFINER | OBJECT_TYPE |
+-------------+-----------------------+---------+-------------+
| migration   | persons               |         | TABLE       |
| migration   | persons_audit         |         | TABLE       |
| migration   | func_cube             | foo@%   | ROUTINE     |
| migration   | before_persons_update | foo@%   | TRIGGER     |
| migration   | v_persons             | foo@%   | VIEW        |
+-------------+-----------------------+---------+-------------+

5 rows in set (0.01 sec)

That’s right, that’s all we got.

The classic command that is executed for this kind of thing is usually the following:

$ mysqldump --single-transaction -h source-host -u percona -ps3cre3t! migration --routines --triggers --compact --add-drop-table --skip-comments > migration.sql

What is the next logical step to follow in the RDS/Aurora instance (AKA the “Destination”)?

  • Create the necessary users (you can do this using the pt-show-grants tool to extract the users and their permissions).
  • Create the schema “migration.”
  • Import the schema from the command line.
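Put into commands, those steps might look like this (endpoints, users, and passwords are placeholders; pt-show-grants ships with Percona Toolkit):

```shell
# 1) Extract users and grants from the source and load them into RDS
pt-show-grants --host source-host --user percona --ask-pass > grants.sql
mysql -h <instance-endpoint> -u percona -p < grants.sql

# 2) Create the target schema
mysql -h <instance-endpoint> -u percona -p -e "CREATE DATABASE migration"

# 3) Import the dump
mysql -h <instance-endpoint> migration -u percona -p < migration.sql
```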

Here we must make a clarification: as you may have noticed, the objects belong to the user “foo,” an application user, and it is very likely that, for security reasons, the client or interested party will not give us that user’s password.

Therefore, as DBAs, we will use a user with all the permissions that AWS allows (unfortunately, AWS does not allow the SUPER privilege). That limitation causes the problem we show below, and we will solve it with absolute certainty.

So, the command to execute the data import would be the following:

$ mysql -h <instance-endpoint> migration -u percona -ps3cre3t! -vv < migration.sql

And this is where the problems begin:

If you want to migrate to a version of RDS MySQL/Aurora 5.7 (which we don’t recommend as the EOL is October 31, 2023!!) you will probably get the following error:

--------------
DROP TABLE IF EXISTS `persons`
--------------

Query OK, 0 rows affected

--------------
/*!40101 SET @saved_cs_client     = @@character_set_client */
--------------

Query OK, 0 rows affected

--------------
/*!50503 SET character_set_client = utf8mb4 */
--------------

Query OK, 0 rows affected

--------------
CREATE TABLE `persons` (
  `PersonID` int NOT NULL,
  `LastName` varchar(255) DEFAULT NULL,
  `FirstName` varchar(255) DEFAULT NULL,
  `Address` varchar(255) DEFAULT NULL,
  `City` varchar(255) DEFAULT NULL,
  PRIMARY KEY (`PersonID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci

... lot of messages/lines

--------------
/*!50003 CREATE*/ /*!50017 DEFINER=`foo`@`%`*/ /*!50003 TRIGGER `before_persons_update` BEFORE UPDATE ON `persons` FOR EACH ROW INSERT INTO persons_audit
 SET PersonID = OLD.PersonID,
     LastName = OLD.LastName,
     City     = OLD.City,
     changedat = NOW() */
--------------

ERROR 1227 (42000) at line 23: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Bye


What does this error mean? We are not executing the import (which is nothing more than running a set of queries and SQL commands) as the user “foo,” the owner of the objects (see the DEFINER column in the first query shown above). So the user “percona” needs a special privilege such as SUPER to impersonate “foo” and create objects on its behalf; but, as we mentioned earlier, that privilege is not available in AWS.

So?

Several options are possible; we will list some of them:

  • Edit the migration.sql file and, for every object whose DEFINER is not percona, replace it with percona or remove the DEFINER clause entirely. Pros: it works. Cons: objects will run under the security context of the user “percona,” which is not only dangerous but also wrong.
  • Apply the solution that my colleague Sveta proposes here, but you must use mysqlpump. Even so, the migrated objects keep the DEFINER with which they were imported.
  • As a last resort, request the password of the user “foo,” which is not always possible.
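For the first option, the edit does not need to be done by hand; a sed one-liner can strip the clauses from the dump. A sketch (the sample line is illustrative):

```shell
# A sample dump line of the kind mysqldump emits (illustrative)
printf '/*!50017 DEFINER=`foo`@`%%`*/ CREATE TRIGGER before_persons_update ...\n' > sample.sql

# Remove every DEFINER=`user`@`host` clause; the importing user then becomes the definer
sed -e 's/DEFINER=`[^`]*`@`[^`]*`//g' sample.sql

rm sample.sql
```

Applied to the real dump, that would be `sed -e 's/DEFINER=`[^`]*`@`[^`]*`//g' migration.sql > migration_nodefiner.sql`; keep in mind the security caveat from the first bullet, since the objects will then run under the importing user’s context.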

As you can see, the solution is not simple; I would say complex, but not impossible.

Let’s see what happens if the RDS/Aurora version is from the MySQL 8 family. Using the same command to perform the import, this is the output:

--------------
DROP TABLE IF EXISTS `persons`
--------------

Query OK, 0 rows affected

--------------
/*!40101 SET @saved_cs_client     = @@character_set_client */
--------------

Query OK, 0 rows affected

--------------
/*!50503 SET character_set_client = utf8mb4 */
--------------

Query OK, 0 rows affected

--------------
CREATE TABLE `persons` (
  `PersonID` int NOT NULL,
  `LastName` varchar(255) DEFAULT NULL,
  `FirstName` varchar(255) DEFAULT NULL,
  `Address` varchar(255) DEFAULT NULL,
  `City` varchar(255) DEFAULT NULL,
  PRIMARY KEY (`PersonID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci
--------------

Query OK, 0 rows affected

... lot of messages/lines

--------------
/*!50003 CREATE*/ /*!50017 DEFINER=`foo`@`%`*/ /*!50003 TRIGGER `before_persons_update` BEFORE UPDATE ON `persons` FOR EACH ROW INSERT INTO persons_audit
 SET PersonID = OLD.PersonID,
     LastName = OLD.LastName,
     City     = OLD.City,
     changedat = NOW() */
--------------

ERROR 1227 (42000) at line 23: Access denied; you need (at least one of) the SUPER or SET_USER_ID privilege(s) for this operation

Oops! A different message appeared, saying something like, “You need (at least one of) SUPER or SET_USER_ID privileges for this operation.”

Therefore, all we have to do now is assign the following permission to the “percona” user:

mysql Destination> GRANT SET_USER_ID ON *.* TO 'percona';

And bingo! The import finishes without problems. Here are some of the commands that previously failed and now succeed:

--------------
/*!50003 CREATE*/ /*!50017 DEFINER=`foo`@`%`*/ /*!50003 TRIGGER `before_persons_update` BEFORE UPDATE ON `persons` FOR EACH ROW INSERT INTO persons_audit
 SET PersonID = OLD.PersonID,
     LastName = OLD.LastName,
     City     = OLD.City,
     changedat = NOW() */
--------------

Query OK, 0 rows affected

--------------
CREATE DEFINER=`foo`@`%` FUNCTION `func_cube`(num INT) RETURNS int
    DETERMINISTIC
begin   DECLARE totalcube INT;    SET totalcube = num * num * num;    RETURN totalcube; end
--------------

Query OK, 0 rows affected

Besides that, the objects keep the DEFINER (that is, the security context) they are supposed to have:

mysql Destination> SELECT *
FROM   (SELECT event_schema AS SCHEMA_NAME,
               event_name   AS OBJECT_NAME,
               definer,
               'EVENT'      AS OBJECT_TYPE
        FROM   information_schema.events
        UNION ALL
        SELECT routine_schema AS SCHEMA_NAME,
               routine_name   AS OBJECT_NAME,
               definer,
               'ROUTINE'      AS OBJECT_TYPE
        FROM   information_schema.routines
        UNION ALL
        SELECT trigger_schema AS SCHEMA_NAME,
               trigger_name   AS OBJECT_NAME,
               definer,
               'TRIGGER'      AS OBJECT_TYPE
        FROM   information_schema.triggers
        UNION ALL
        SELECT table_schema AS SCHEMA_NAME,
               table_name   AS OBJECT_NAME,
               definer,
               'VIEW'       AS OBJECT_TYPE
        FROM   information_schema.views
        UNION ALL
        SELECT table_schema AS SCHEMA_NAME,
               table_name   AS OBJECT_NAME,
               '',
               'TABLE'       AS OBJECT_TYPE
        FROM   information_schema.tables
        Where engine <> 'NULL'
) OBJECTS
WHERE  OBJECTS.SCHEMA_NAME = 'migration'
ORDER  BY 3,
          4;
+-------------+-----------------------+---------+-------------+
| SCHEMA_NAME | OBJECT_NAME           | DEFINER | OBJECT_TYPE |
+-------------+-----------------------+---------+-------------+
| migration   | persons               |         | TABLE       |
| migration   | persons_audit         |         | TABLE       |
| migration   | func_cube             | foo@%   | ROUTINE     |
| migration   | before_persons_update | foo@%   | TRIGGER     |
| migration   | v_persons             | foo@%   | VIEW        |
+-------------+-----------------------+---------+-------------+
5 rows in set (0.01 sec)

Conclusion

As you can see, there are no more excuses: it is time to migrate to MySQL 8. Small details like this one make the move easier.

A migration of this type is almost always problematic; it requires several iterations in a test environment until everything works well, and things can still fail. Now, my dear reader, knowing that MySQL 8 solves this problem (as of version 8.0.22), I ask you: what are you waiting for to migrate?

Of course, these kinds of migrations can be complex. But Percona is at your service, and I recommend Upgrading to MySQL 8: Tools That Can Help from my colleague Arunjith, which can guide your migration to a good destination.

And remember, you always have the chance to contact us and ask for assistance with any migration.  You can also learn how Percona experts can help you migrate to Percona Server for MySQL seamlessly:

 

Upgrading to MySQL 8.0 with Percona

 

I hope you enjoyed the blog, and see you in the next one!

Percona Database Performance Blog

PHPSandbox – Build, Prototype, and share PHP apps in seconds


PHPSandbox is a web app that allows you to quickly prototype or share PHP projects without setting up a local environment.

It’s a pretty neat service because it allows you to test all kinds of things, such as the new “slim skeleton” in Laravel 11, or our Livewire Volt demo app, and even the new Laravel Prompts feature that Jess Archer demoed at Laracon.

Here are some more of the features PHPSandbox includes:

Preview on the go

PHPSandbox automatically provisions a permanent preview URL for your project so you can see your changes instantly.

Comprehensive Environment

Multiple PHP Versions, all PHP extensions you need, and a full-featured Linux environment.

Git and GitHub Integration

Import an existing public composer project from GitHub or Export your projects on PHPSandbox to GitHub.

Composer

The Composer integration allows you to use Composer in your projects while they ensure it keeps working.

Customizable Environment

Configure your environment to your liking. Do you want to change your PHP version or your public directory? No problem.

PHPSandbox is the perfect playground

The JavaScript and CSS ecosystems have long had code playgrounds, but this is one of the nicest ones available for PHP. The base plan is free, and there’s an upgraded professional plan for $6 a month that includes private repos, email captures, and more.

But the base plan works great for quickly testing out packages and making demos of your next tutorial.

Laravel News

Dynamic SQL Workaround in MySQL: Prepared Statements


Dynamic SQL is a desirable feature that allows developers to construct and execute SQL statements dynamically at runtime. While MySQL lacks built-in support for dynamic SQL, this article presents a workaround using prepared statements. We will explore leveraging prepared statements to achieve dynamic query execution, parameterized queries, and dynamic table and column selection.

Understanding prepared statements

Prepared statements in MySQL let you build a SQL statement as a string at runtime and then execute it, rather than writing the statement statically in the code. This provides flexibility in manipulating query components, such as table names, column names, conditions, and sorting. The PREPARE and EXECUTE statements are the key components for executing dynamic SQL in MySQL.

Example usage: Let’s consider a simple example where we want to construct a dynamic SELECT statement based on a user-defined table name and value:

SET @table_name := 't1';
SET @value := '123';
SET @sql_query := CONCAT('SELECT * FROM ', @table_name, ' WHERE column = ?');

PREPARE dynamic_statement FROM @sql_query;
EXECUTE dynamic_statement USING @value;
DEALLOCATE PREPARE dynamic_statement;

In this example, we use the CONCAT function to construct the dynamic SQL statement. Note that only the table name is concatenated into the SQL string (identifiers cannot be bound as parameters), while the value is bound at execution time through the ? placeholder and the USING clause.

Benefits and features

  • Standalone and Stored Procedure Use: Prepared statements can be used both as standalone SQL statements and inside stored procedures, providing flexibility in different contexts.
  • Support for Various SQL Statements: Many kinds of SQL statements can be executed through prepared statements, including DROP DATABASE, TRUNCATE TABLE, FLUSH TABLES, and KILL. This allows for dynamic execution of diverse operations.
  • Usage of Stored Procedure Variables: Stored procedure variables can be incorporated into the dynamic expression, enabling dynamic SQL based on runtime values.

Let’s look at another scenario:

Killing queries for a specific user:

DELIMITER //
CREATE PROCEDURE kill_all_for_user(user_connection_id INT)
BEGIN
  SET @sql_statement := CONCAT('KILL ', user_connection_id);
  PREPARE dynamic_statement FROM @sql_statement;
  EXECUTE dynamic_statement;
  DEALLOCATE PREPARE dynamic_statement;
END //
DELIMITER ;

In this case, the prepared statement dynamically constructs a KILL statement to terminate the connection identified by user_connection_id; KILL cannot take its argument from a variable directly, which is why dynamic SQL is needed here.

Conclusion

Prepared statements make dynamic queries possible, but dynamic queries can definitely make debugging more challenging. Consider implementing additional testing and error handling to mitigate this issue and catch problems with dynamic queries early in the development process.
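One way to add such error handling inside a stored procedure is a handler wrapped around the PREPARE/EXECUTE pair. A sketch (the procedure name and the error message are illustrative, not a standard API):

```sql
DELIMITER //

CREATE PROCEDURE run_dynamic(IN sql_text TEXT)
BEGIN
  -- If PREPARE or EXECUTE raises an error, report it instead of failing silently
  DECLARE EXIT HANDLER FOR SQLEXCEPTION
    SELECT 'dynamic statement failed' AS status, sql_text AS attempted_sql;

  SET @stmt := sql_text;  -- PREPARE requires a user variable, not a parameter
  PREPARE dynamic_statement FROM @stmt;
  EXECUTE dynamic_statement;
  DEALLOCATE PREPARE dynamic_statement;
END //

DELIMITER ;
```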

Percona Monitoring and Management is a best-of-breed open source database monitoring solution. It helps you reduce complexity, optimize performance, and improve the security of your business-critical database environments, no matter where they are located or deployed.

 

Download Percona Monitoring and Management Today

Planet MySQL