Listen To Stephen Fry Perfectly Analogize The Moral Panics Around Facebook To The Ones Over The Printing Press

So I’m a bit late to this, as Stephen Fry released a podcast "documentary" entitled Great Leap Years a few months back. I’ve only just started listening to it, and it hits on so many of the points and ideas that I’ve tried to address here on Techdirt over the course of the past 20 years, but does so much more brilliantly than anything I’ve done in those ~70,000 posts. In short: if you like what we write about here concerning the nature of innovation and technology, I highly recommend the podcast, having just listened to the first two episodes.

And just to give you a sense of this, I’m going to quote a bit from near the end of the 2nd podcast. This isn’t revealing any spoilers, and the storytelling is so wonderful that you really ought to listen to the whole thing. But this so perfectly encapsulates many of my thoughts about why people freaking out about "bad stuff" happening on Facebook, Twitter, YouTube and more are in the midst of a moral panic not unlike those we’ve seen before. None of this is to say that we should ignore the "bad stuff" that is happening, or try to minimize it. But it does suggest that we take a broader perspective and recognize that, maybe, this is the way humans are, and it’s not "this new technology" that’s to blame.

The episode itself is about the invention of the printing press by Johannes Gutenberg (and it wonderfully works in some details about Gutenberg’s real name that I had not known). After going through the details of Gutenberg and his invention, Fry discusses how the Catholic Church was initially overjoyed at the invention, noting that it could print and sell indulgences faster (which is an important callback to the 1st episode…). There’s a brief discussion of how the Church suddenly realizes its "mistake" and tries to fight back, and then this:

All kinds of bad people saw the opportunity to harness the power of the printed word for their own ends. Ends that could result in burnings, massacres, and wars. The speed of the transmission of information accelerated everything.

You might say that the medieval world had been like one of those sluggish hormonally slowed down catatonic patients in Oliver Sacks’ book, Awakenings, later made into a film with Robert De Niro and Robin Williams. Encephalitis lethargica was their affliction. Statue-like, motionless, with low body temperature, slow heart rate, zombie-like lethargy and stillness, they lived almost dormant lives. Sacks saw one such patient with just this disease, who was otherwise perfectly healthy, save for a small tumor in his tummy. Sacks injected his magical L-DOPA serum, and the man swiftly woke from his torpor, totally restored. Smiling, walking, remembering. Fully awake and alive. Everything back to speed… including his tumor. He was dead within two months. Killed by the stomach cancer which had awoken from its dormancy with the rest of him.

You might regard Europe as having been in just such a torpid state. The arrival of printing was like an injection of life-giving serum into Europe. It awoke and energized the world. But aggravated all kinds of cancers of tribalism, sectarianism, and rivalry too. In a manner all too familiar to us in our day, a cultural, intellectual, ideological and doctrinal chasm opened up in Europe. Culture wars that foreshadowed our own broke out. The Muslim world banned printing of Arabic or of Islamic texts. For centuries, Jews were banned from the printing trade and Christian countries forbade the printing of Hebrew texts. Propaganda took off. Edicts and attacks on Protestantism flew from Catholic presses, and vice versa.

As the historian Niall Ferguson argues in his book The Square and the Tower: the invention of movable type printing and the unleashing of what is known as the Gutenberg Revolution created social networks in which two sides countered each other with misinformation (fake news, as we would have it now), with vicious abuse, and (as in our time) all without supervision or a locus of recognizable authority. A free-for-all raging outside of what had previously been structured hierarchies. Because anyone could use the invention, all kinds of bad actors and malevolent hustlers did use it.

Technologies like printing, or any other information technologies that have followed in its wake, are essentially neutral, have no moral valency, no inner directive in and of themselves to act either for good or ill. Indulgences could be printed, and broadsides attacking the corruption of indulgences could be printed just as easily. Das Kapital or Mein Kampf. It’s all the same to the type, the paper, and the platen. The Declaration of Independence or The Protocols of the Elders of Zion? The sonnets of Shakespeare or the thoughts of Chairman Mao? Collections of recipes for cake making or collections of recipes for bomb making.

All this is familiar to us, we who mourn the swift death of the Utopian ideals promised by the internet and social media. The letter types in their boxes could seem like the evil spirits that flew from Pandora’s box and released strife, starvation, war, and wickedness into the world.

I’ve, perhaps, now gone too far the other way. After all, impulses and new ways of thinking and exchanging ideas that were benevolent flourished too. To depict the Gutenberg Revolution as causing a human disaster is as sentimental and over-simplified as seeing it as having ushered in a golden age of open thought and perfect freedoms. Or, as regarding early humans, moving from hunter gatherers to agriculturalists as catastrophic. Or, looking on social networks and media as wholly calamitous.

Part of what this series of podcasts is aiming to do is to come to terms with the inevitability of… let’s call it "change." Progress may be regarded as too freighted a word. Change, transformation, mutation, cultural evolution. These are our weather systems. Our historical and future landscapes were and are shaped by these processes, just as our geographical landscapes are shaped by the action of water and weather. To believe that we should or could halt them, or to waste time mourning their existential alterations to our ways of living is, to put it crudely, to piss into the wind. The movable type revolution was necessary and never a genie that any sane person would want to be forced back into its bottle.

Yes, cancers may have woken up in Europe at the same time as a new life surged through its bloodstream. But surely better a quick hot life, however cut short, than a permanent frozen nothingness, a catatonic zombie nullity.

The key is not to bemoan or to overpraise change, but to attempt as best we may to know all we can about the transformative nature of our leaps of innovation and to understand them. For today, changes are coming that will dwarf the revolutions in information technology with which we are familiar. It has never been more important, in my view, to be armed with knowledge and understanding of our past in order to confront our future with anything like confidence.

There’s more and you should listen to the whole thing — but this is a succinct and brilliantly described viewpoint that I’ve long shared about technology and innovation. Going back all the way to the copyright debates that we had on this site from the earliest days, the key point that I kept raising over and over again is that fighting over the claims that infringement is somehow "bad" totally misses the point. It is happening. And if it is happening, bemoaning that it was undermining traditional business models (which had their own problems for culture, free speech and, importantly, for artists themselves) was a silly waste of time. Wouldn’t we have been better served looking to understand what new things were being enabled, and how those might be used to encourage more creativity and innovation?

And, of course, now we’re having similar fights and discussions (as Fry clearly notes) about social media and the internet. And I’m sure there will be others — perhaps about artificial intelligence or 3D printing or blockchain or satellites and space travel. Many of those debates have already started. And, as new technologies and innovations come about, there will be more to debate and to understand.

But if the default is to start from the position that anything bad created by these new technologies condemns the technologies themselves, we will lose out. Not necessarily on the technologies themselves — as those seem to have a way of advancing — but on the ability to harness those technologies in the most useful and most fruitful ways. If we fear the transformations or focus solely on what will most prevent the "bad" or bring back the world that used to be, we will undoubtedly lose out on many of the good things that come along as well.

This is the key point that Fry so nicely puts forth in the two episodes I’ve listened to so far. Change is happening, and it has both good and bad consequences. No one should deny that. Focusing solely on one side rather than the other doesn’t change any of that, but it can create a lot of wasted time and effort. Instead, understanding the nature of that change and looking for ways to encourage more of the good while discouraging the bad is a reasonable path forward. But that has to come through understanding what’s happening and recognizing that it is an impossible and pointless task to seek to remove or prevent all of the bad.

So many of the technological fights we talk about today over copyright, patents, encryption, the future of work, surveillance, and more seem to stem from legacy operations that had a handle on things in the past and no longer have one today. But rather than looking for reasonable paths forward that preserve the good new things, they focus on eradicating the bad — which is not just an impossible and fruitless plan, but one that will create significantly more negative consequences (intended or not).

Fry’s podcast is great in providing some more historical perspective on this, but has also helped me better frame the work that we’ve tried to do here on Techdirt over the past two decades, and which we’ll hopefully continue for many more.


via Techdirt
Listen To Stephen Fry Perfectly Analogize The Moral Panics Around Facebook To The Ones Over The Printing Press

Auditing MariaDB for Secured Database Infrastructure Operations

When you are building database infrastructure for a data-sensitive business (financial services, digital commerce, advertising, media, healthcare, etc.) governed by compliance and policies, you are expected to maintain an audit log of transactions so you can investigate if you ever suspect something unacceptable happening to your database (e.g., a user updating or deleting data). MariaDB provides the Audit Plugin to log server activity. (MariaDB has included the Audit Plugin by default since versions 10.0.10 and 5.5.37, and it can be installed in any version from MariaDB 5.5.20 onward.) Although the MariaDB Audit Plugin has some unique features available only for MariaDB, it can also be used with MySQL. The plugin logs details such as who connected to the server (username and host), what queries were executed, which tables were accessed, and which server variables were changed. This information is retained in a rotating log file or sent to the local syslogd. This blog is a fully hands-on guide to "Auditing MariaDB for Secured Database Infrastructure Operations".

MariaDB Audit Plugin installation

The MariaDB Audit Plugin is provided as a dynamic library: server_audit.so (server_audit.dll for Windows). The file path of the plugin library is stored in the plugin_dir system variable:

MariaDB [(none)]> select @@plugin_dir; 
+--------------------------+
| @@plugin_dir             |
+--------------------------+
| /usr/lib64/mysql/plugin/ |
+--------------------------+
1 row in set (0.000 sec)

One way to install this plugin is to execute the INSTALL SONAME statement while logged into MariaDB. You must use an administrative account with the INSERT privilege on the mysql.plugin table:

MariaDB [(none)]> INSTALL SONAME 'server_audit';
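
To confirm the plugin actually loaded, you can query information_schema (a quick verification step added here for completeness; it is not part of the original walkthrough). If the install succeeded, the plugin is reported as ACTIVE:

MariaDB [(none)]> SELECT PLUGIN_NAME, PLUGIN_STATUS FROM information_schema.PLUGINS WHERE PLUGIN_NAME = 'SERVER_AUDIT';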

Loading Plugin at Start-Up

You can also load the plugin at server startup by configuring my.cnf or my.ini (typically /etc/my.cnf or /etc/mysql/my.cnf). Below is the my.cnf configuration we use to enable the MariaDB Audit Plugin (please add these variables under the [mysqld] or [mariadb] section):

plugin_load=server_audit=server_audit.so
server_audit_events=CONNECT,QUERY,TABLE
server_audit_logging=ON
server_audit=FORCE_PLUS_PERMANENT

We don’t want somebody to uninstall the MariaDB Audit Plugin, so we set the system variable server_audit=FORCE_PLUS_PERMANENT. The example below shows what happens when someone tries:

MariaDB [(none)]> UNINSTALL PLUGIN server_audit;
ERROR 1702 (HY000): Plugin 'server_audit' is force_plus_permanent and can not be unloaded

To see the list of audit plugin-related variables in your MariaDB server, execute the command below:

MariaDB [(none)]> SHOW GLOBAL VARIABLES LIKE 'server_audit%';
+-------------------------------+-----------------------+
| Variable_name                 | Value                 |
+-------------------------------+-----------------------+
| server_audit_events           | CONNECT,QUERY,TABLE   |
| server_audit_excl_users       |                       |
| server_audit_file_path        | server_audit.log      |
| server_audit_file_rotate_now  | OFF                   |
| server_audit_file_rotate_size | 1000000               |
| server_audit_file_rotations   | 9                     |
| server_audit_incl_users       |                       |
| server_audit_logging          | ON                    |
| server_audit_mode             | 0                     |
| server_audit_output_type      | file                  |
| server_audit_query_log_limit  | 1024                  |
| server_audit_syslog_facility  | LOG_USER              |
| server_audit_syslog_ident     | mysql-server_auditing |
| server_audit_syslog_info      |                       |
| server_audit_syslog_priority  | LOG_INFO              |
+-------------------------------+-----------------------+
15 rows in set (0.002 sec)
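
Two of the variables above, server_audit_incl_users and server_audit_excl_users, control which accounts are audited. As an illustrative sketch (the account name monitor_user is hypothetical), you can exclude a noisy monitoring account from the audit log at runtime:

MariaDB [(none)]> SET GLOBAL server_audit_excl_users = 'monitor_user';

If an account appears in both lists, server_audit_incl_users takes priority per the plugin documentation.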

An uncontrolled audit log is a major concern in any MariaDB database infrastructure operation, so I strongly recommend that our customers rotate the "server_audit.log" file. You can force a rotation by enabling server_audit_file_rotate_now:

MariaDB [(none)]> SET GLOBAL server_audit_file_rotate_now = ON;
Query OK, 0 rows affected (0.015 sec)

You can configure the size limit of the audit log by setting the variable server_audit_file_rotate_size. To limit the number of log files created, set the variable server_audit_file_rotations. To force log file rotation, set the variable server_audit_file_rotate_now to ON:

[mariadb]
..
server_audit_file_rotate_now=ON
server_audit_file_rotate_size=1000000
server_audit_file_rotations=10
...

MariaDB Audit Plugin report:

[root@localhost mysql]# tail -f server_audit.log
20180720 20:39:22,localhost.localdomain,root,localhost,13,1501,QUERY,,'SELECT DATABASE()',0
20180720 20:39:22,localhost.localdomain,root,localhost,13,1503,QUERY,sakila,'show databases',0
20180720 20:39:22,localhost.localdomain,root,localhost,13,1504,QUERY,sakila,'show tables',0
20180720 20:39:27,localhost.localdomain,root,localhost,13,1528,QUERY,sakila,'show tables',0
20180720 20:39:43,localhost.localdomain,root,localhost,13,1529,READ,sakila,customer,
20180720 20:39:43,localhost.localdomain,root,localhost,13,1529,QUERY,sakila,'select * from customer limit 100',0
20180720 20:39:52,localhost.localdomain,root,localhost,13,1530,QUERY,sakila,'show tables',0
20180720 20:40:07,localhost.localdomain,root,localhost,13,1531,READ,sakila,actor,
20180720 20:40:07,localhost.localdomain,root,localhost,13,1531,QUERY,sakila,'select * from actor limit 100',0
20180720 20:40:30,localhost.localdomain,root,localhost,13,0,DISCONNECT,sakila,,0
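
Each log line is comma-separated: timestamp, server host, user, client host, connection id, query id, operation, database, and object or query, followed by the return code. As a rough sketch for ad-hoc review (quoted queries that contain commas will confuse a naive field split, so treat this as a starting point rather than a real parser):

[root@localhost mysql]# awk -F',' '$7 == "QUERY" { print $1, $3, $9 }' server_audit.log

This prints the timestamp, user, and query text of every QUERY event in the log.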

Conclusion

We recommend that most of our customers using MariaDB enable the MariaDB Audit Plugin to closely monitor what is happening in their database infrastructure. This really helps with proactive troubleshooting if anything goes wrong in their MariaDB operations. Reliable and secure database operations are just as important as performance and scalability.

The post Auditing MariaDB for Secured Database Infrastructure Operations appeared first on MySQL Consulting, Support and Remote DBA Services.

via Planet MySQL
Auditing MariaDB for Secured Database Infrastructure Operations

Ask Slashdot: Should I Ditch PHP?

Long-time Slashdot reader Qbertino does PHP for a living, but says he’s growing "increasingly frustrated with the ignorant and clueless in the vicinity of PHP."
Crappy code and baaaaad application setups are one thing, but people refusing to fix them, or simply not understanding the broader implications of bad applications, or attempting SEO with gadgets while refusing to fix 3.5 MB-per-pagecall, are just minor tidbits in a history of increasingly unnerving run-ins with knuckle-draggers in the "web agency" camp… Will I leave the larger part of this backwards stuff behind if I move to another server-side programming language such as Java or Kotlin for professional work in the broader web area? Do I have a chance to do quality work on quality projects using PHP, or are those chances slim compared to other programming languages? In short, should I ditch PHP? "I think .NET is a much cleaner language to work in, with Microsoft’s excellent Visual Studio IDE and debugger," argues Slashdot reader Agret, adding "there are many large projects in my city hiring .NET developers, and being a strongly typed language, the code quality is generally better than PHP." But what’s been your experience? And would a frustrated developer find more quality projects by ditching PHP?





via Slashdot
Ask Slashdot: Should I Ditch PHP?

The First Trailer for Aquaman Is as Epic as It Is Moist


All-out fish war descends upon Atlantis in the Aquaman trailer.
GIF: Aquaman (Warner Bros.)

And it’s very moist.

If there was any question that Furious 7 director James Wan was going to make something truly massive and epic with Aquaman, those doubts can be put to bed…the seabed.

Inside Hall H at San Diego Comic-Con Saturday, Wan presented the first trailer from the highly anticipated DC movie and brought the house down. Check it out for yourself:

Jason Momoa stars as the titular hero in the film, alongside Amber Heard as the hero Mera and Patrick Wilson as Ocean Master, along with Willem Dafoe, Nicole Kidman, Yahya Abdul-Mateen II, and others. It opens December 21.


via Gizmodo
The First Trailer for Aquaman Is as Epic as It Is Moist

The First Shazam! Trailer Finally Lets the DC Universe Have Some Fun


After Batman v Superman, Suicide Squad, and Justice League, all of which exist in the same universe, Shazam! looks like it’s going to be a breath of fresh air for the DC Extended Universe. Very happy, very silly fresh air.

If you’re unfamiliar with the classic superhero and/or haven’t been following io9’s coverage, Shazam the superhero is, in fact, a young boy named Billy Batson (played by Asher Angel), who transforms into an adult superhero (Chuck’s Zachary Levi) when he yells “Shazam!” As the trailer shows, Billy is totally down with his new powers—and his new age.

Directed by David F. Sandberg, Shazam! also stars Mark Strong as the villain Dr. Sivana, Djimon Hounsou as the wizard who grants Billy his powers, and more.

Shazam! shazams its way into theaters on April 5, 2019.


via Gizmodo
The First Shazam! Trailer Finally Lets the DC Universe Have Some Fun

When Should I Use Amazon Aurora and When Should I use RDS MySQL?


Now that Database-as-a-Service (DBaaS) is in high demand, there is one question regarding AWS services that cannot always be answered easily: when should I use Aurora and when RDS MySQL?

DBaaS cloud services allow users to use databases without configuring physical hardware and infrastructure, and without installing software. I’m not sure if there is a straightforward answer, but when trying to find out which solution best fits an organization there are multiple factors that should be taken into consideration. These may be performance, high availability, operational cost, management, capacity planning, scalability, security, monitoring, etc.

There are also cases where although the workload and operational needs seem to best fit to one solution, there are other limiting factors which may be blockers (or at least need special handling).

In this blog post, I will try to provide some general rules of thumb but let’s first try to give a short description of these products.

What we should really compare is the MySQL and Aurora database engines provided by Amazon RDS.

An introduction to Amazon RDS

Amazon Relational Database Service (Amazon RDS) is a hosted database service which provides multiple database products to choose from, including Aurora, PostgreSQL, MySQL, MariaDB, Oracle, and Microsoft SQL Server. We will focus on MySQL and Aurora.

With regards to systems administration, both solutions are time-saving. You get an environment ready to deploy your application and if there are no dedicated DBAs, RDS gives you great flexibility for operations like upgrades or backups. For both products, Amazon applies required updates and the latest patches without any downtime. You can define maintenance windows and automated patching (if enabled) will occur within them. Data is continuously backed up to S3 in real time, with no performance impact. This eliminates the need for backup windows and other, complex or not, scripted procedures. Although this sounds great, the risk of vendor lock-in and the challenges of enforced updates and client-side optimizations are still there.

So, Aurora or RDS MySQL?

Amazon Aurora is a relational, proprietary, closed-source database engine, with all that that implies.

RDS MySQL is 5.5, 5.6 and 5.7 compatible and offers the option to select among minor releases. While RDS MySQL supports multiple storage engines with varying capabilities, not all of them are optimized for crash recovery and data durability. Until recently, Aurora was only compatible with MySQL 5.6, but it is now compatible with 5.7 as well.

So, in most cases, no significant application changes are required for either product. Keep in mind that certain MySQL features like the MyISAM storage engine are not available with Amazon Aurora. Migration to RDS can be performed using Percona XtraBackup.
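
As a rough sketch of that migration flow (the backup user, password, paths, and S3 bucket below are placeholders, and you should check the RDS documentation for supported engine versions), a physical backup is taken with XtraBackup and staged in S3, from which RDS can restore it into a new MySQL instance:

# Take a physical backup with Percona XtraBackup, streamed as a compressed tar
xtrabackup --backup --user=backup_user --password=secret --stream=tar --target-dir=/tmp/backup | gzip > full-backup.tar.gz

# Upload the backup to S3, where the RDS "restore from S3" workflow can pick it up
aws s3 cp full-backup.tar.gz s3://myapp-db-backups/full-backup.tar.gz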

For RDS products, shell access to the underlying operating system is disabled, and access to MySQL user accounts with the “SUPER” privilege isn’t allowed. To configure MySQL variables or manage users, Amazon RDS provides specific parameter groups, APIs and other special system procedures which can be used. If you need to enable remote access, this article will help you do so: https://www.percona.com/blog/2018/05/08/how-to-enable-amazon-rds-remote-access/
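
Since SET GLOBAL from a SUPER account is off the table, configuration changes go through parameter groups instead. Here is a minimal sketch using the AWS CLI (the group name myapp-params and the parameter chosen are illustrative):

# Change a server variable via the DB parameter group attached to the instance
aws rds modify-db-parameter-group \
    --db-parameter-group-name myapp-params \
    --parameters "ParameterName=max_connections,ParameterValue=500,ApplyMethod=immediate"

Dynamic parameters can be applied immediately, while static ones only take effect after an instance reboot.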

Performance considerations

Although Amazon RDS uses SSDs to achieve better IO throughput for all its database services, Amazon claims that Aurora is able to achieve a 5x performance boost over standard MySQL and provides reliability out of the box. In general, Aurora seems to be faster, but not always.

For example, because Aurora needs the InnoDB change buffer disabled (this is one of the keys to its distributed storage engine), updates to secondary indexes must be written through, and there is a big performance penalty in workloads with heavy writes that update secondary indexes. This is because of the way MySQL relies on the change buffer to defer and merge secondary index updates. If your application performs a high rate of updates against tables with secondary indexes, Aurora performance may be poor. In any case, you should always keep in mind that performance depends on schema design. Before deciding to migrate, performance should be evaluated against an application-specific workload. Doing extensive benchmarks will be the subject of a future blog post.

Capacity Planning

Talking about underlying storage, another important consideration is that with Aurora there is no need for capacity planning. Aurora storage automatically grows from a minimum of 10 GB up to 64 TiB, in 10 GB increments, with no impact on database performance. The size of a table is constrained only by the size of the Aurora cluster volume, so the maximum table size in an Aurora database is 64 TiB. For RDS MySQL, the maximum provisioned storage limit constrains the size of a table to a maximum of 16 TB when using InnoDB file-per-table tablespaces.

Replication

Replication is a really powerful feature of MySQL(-like) products. With Aurora, you can provision up to fifteen replicas, compared to just five in RDS MySQL. All Aurora replicas share the same underlying volume with the primary instance, which means replication can be performed in milliseconds, as updates made by the primary instance are instantly available to all Aurora replicas. Failover is automatic, with no data loss, on Amazon Aurora, and the replicas’ failover priority can be set.

An explanatory description of Amazon Aurora’s architecture can be found in Vadim’s post written a couple of years ago https://www.percona.com/blog/2015/11/16/amazon-aurora-looking-deeper/

The architecture used, and the way replication works on both products, shows a really significant difference between them. Aurora is a High Availability (HA) solution: you only need to attach a reader and it automatically becomes Multi-AZ available. Aurora replicates data to six storage nodes across multiple Availability Zones to withstand the loss of an entire AZ, or of two storage nodes, without any availability impact to clients’ applications.

On the other hand, RDS MySQL allows only up to five replicas, and the replication process is slower than on Aurora. Failover is a manual process that may result in loss of the most recent transactions. RDS MySQL is not an HA solution by itself, so you have to mark the master as Multi-AZ and attach the endpoints.
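
To make the architectural difference concrete, here is a hedged sketch of adding a replica with the AWS CLI in each case (all identifiers are hypothetical). An Aurora replica is simply another instance attached to the shared cluster volume, while an RDS MySQL read replica is a separate instance fed by asynchronous binlog replication:

# Aurora: add a reader instance to an existing cluster (it shares the cluster volume)
aws rds create-db-instance \
    --db-instance-identifier myapp-aurora-reader-1 \
    --db-cluster-identifier myapp-aurora-cluster \
    --engine aurora \
    --db-instance-class db.r4.large

# RDS MySQL: create a read replica of a source instance
aws rds create-db-instance-read-replica \
    --db-instance-identifier myapp-mysql-replica-1 \
    --source-db-instance-identifier myapp-mysql-master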

Monitoring

Both products can be monitored with a variety of monitoring tools. You can enable automated monitoring and you can define the log types to publish to Amazon CloudWatch. Percona Monitoring and Management (PMM) can also be used to gather metrics.

Be aware that for Aurora there is a limitation on T2 instances: Performance Schema can cause the host to run out of memory if enabled.

Costs

Aurora instances will cost you roughly 20% more than RDS MySQL. If you create Aurora read replicas, the cost of your Aurora cluster will double. Aurora is only available on certain RDS instance sizes. Instance pricing details can be found here and here.

Storage pricing may be a bit tricky. Keep in mind that pricing for Aurora differs from that for RDS MySQL. For RDS MySQL, you have to select the type and size of the EBS volume, and you have to be sure that the provisioned EBS IOPs can be supported by your instance type, as EBS IOPs are restricted by the instance type’s capabilities. Unless you watch for this, you may end up with EBS IOPs that cannot really be used by your instance.

For Aurora, IOPs are only limited by the instance type. This means that if you want to increase IOPs performance on Aurora you should proceed with an instance type upgrade. In any case, Amazon will charge you based on the dataset size and the requests per second.

That said, although for Aurora you pay only for the data you actually use (in 10 GB increments), if you want high performance you have to select the correct instance. For Aurora, regardless of the instance type, you are billed $0.10 per GB-month and $0.20 per 1 million requests, so if you need high performance the cost may be even more than RDS MySQL. For RDS MySQL, storage costs are based on the EBS type and size.

Percona provides support for RDS services, and you might be interested in these case studies:

When a more fully customized solution is required, most of our customers usually prefer the use of AWS EC2 instances supported by our managed services offering.

TL;DR
  • If you are looking for a native HA solution then you should use Aurora
  • For a read-intensive workload within an HA environment, Aurora is a perfect match. Combined with ProxySQL for RDS, you can get high flexibility
  • Aurora performance is great, but not as good as you might expect for write-intensive workloads when secondary indexes exist. In any case, you should benchmark both RDS MySQL and Aurora before deciding to migrate. Performance depends greatly on workload and schema design
  • By choosing Amazon Aurora you are fully dependent on Amazon for bug fixes or upgrades
  • If you need to use MySQL plugins you should use RDS MySQL
  • Aurora only supports InnoDB. If you need other engines, e.g. MyISAM, RDS MySQL is the only option
  • With RDS MySQL you can use specific MySQL releases
  • Aurora is not included in the AWS free tier and costs a bit more than RDS MySQL. If you only need a managed solution to deploy services in a less expensive way, and out-of-the-box availability is not your main concern, RDS MySQL is what you need
  • If for any reason Performance Schema must be ON, you should not enable this on Amazon Aurora MySQL T2 instances. With the Performance Schema enabled, the T2 instance may run out of memory
  • For both products, you should carefully examine the known issues and limitations listed here https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/MySQL.KnownIssuesAndLimitations.html and here https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Aurora.AuroraMySQL.html

Related

Ananias Tsalouchidis

Ananias joined Percona as a Senior MySQL DBA in May 2017. He holds a BSc and an MSc in computer science and has 10+ years of working experience as a systems and database administrator. He loves databases and Perl scripting. He has worked for some big Greek companies and academic institutions and has also been involved in numerous research programs.
He is located in Thessaloniki, Greece, with his wife and their children.


via MySQL Performance Blog
When Should I Use Amazon Aurora and When Should I use RDS MySQL?

.22LR Ammunition Shootout: Two Rifles, 17 Brands


By Thundervoice

The first gun I ever shot was a .22LR rifle. I remember that day well, as my Dad took me to a range when I was about nine and taught me how to shoot a Remington Model 511 that he had acquired as a kid. It is a rifle I still have and will pass on to my son someday. While I don’t shoot that rifle often, it is probably my favorite firearm.

Josh Wayner’s TTAG piece on .22LR reminded me of my stash of .22 ammo waiting to be used. I have enough .22 ammo now to be picky about what I buy in the future. The TTAG Summer Content Contest was just enough impetus for me to knock something off my “to do” list – have a “.22 shoot-off” competition so that I can make more informed decisions about which type of .22 ammo to buy the next time I feel like restocking my stash.

The premise of my shoot-off was simple: shoot each type of .22 ammo in my stash to see which is the most accurate in two different rifles. Since the Remington Model 511 is my sentimental favorite firearm, there was no doubt it would be one of the rifles. That rifle has a 25-inch barrel, a 16:1 twist, and open sights.

Since Josh said that the “Ruger 10/22 rifle is probably the most popular single 22 rifle ever made,” I figured that would be a good choice for the second rifle. My 10/22 has a 20-inch barrel, a 15:1 twist, and a Bushnell TRS-25 red dot sight.

I decided to shoot 10 rounds of each ammo type with each rifle, using a dedicated target for each rifle and ammo combination. With 17 different types of ammo, that made for 34 targets, 340 rounds, and about 2.75 hours of shooting on a hot Texas Saturday at the range.

While I can typically shoot a six-inch or better group at 100 yards with the Model 511, I elected to shoot the rifles at 25 yards to keep the group sizes smaller and make it easier to use the open sights on the Model 511. At almost 60 years of age, I’ve had to go to monovision with my contacts, which makes shooting with open sights more challenging than it was in my younger days.

 

I shot from a sitting position using a support for the forestock. I would shoot 10 rounds of a given type with the Ruger, then 10 rounds of the same type with the Remington. I would then reload the magazines of each with the next ammo type and pull a BoreSnake through each barrel before starting the next round of shooting.

The cycle for a given ammo type was about seven minutes except when I had to reset targets. This rate kept the barrels from getting too hot, even with an ambient temperature that started at 90 degrees and climbed from there. I could set targets for six ammo types each trip downrange, which meant I had to go downrange three times. I shot 10 rounds through each rifle before starting with the actual shoot-off so that the barrels would not be cold for the first round.

The ammo used in the experiment was purchased at various times, from various locations, at various prices over the last 3.5 years. Most came from Wal-Mart, Academy, or Gander Mountain (especially during their liquidation sale). I decided to randomize the shooting order so as not to favor any particular ammo type.

The table below shows the ammo types listed in the order in which they were shot, along with advertised velocity, bullet weight, cost/round, and the 9- and 10-shot group sizes.

A few notes about the information in this table. The ammunition types listed are not intended to be an all-inclusive list of .22LR ammo; it’s just what I had on hand. The bullet type indicates the weight in grains and two other features: 1) whether it was copper plated (CP) or lead (L), and 2) round nose (RN) or hollow point (HP). The Remington Golden Bullets are brass plated (BP). The Winchester M-22 bullets are black copper plated. The velocity was taken from the packaging or looked up on the manufacturer’s website.

The cost per round was determined by selecting the lowest costs from websites or my local Wal-Mart on July 13, 2018. I checked prices on the Academy Sporting Goods, Ammunition Depot, Bud’s Gun Shop, Cabelas, and MidwayUSA websites. Bud’s was the closest to having all of the ammo types on my list and generally had the lowest prices. Packaging quantities (rounds/box) varied from 40-round to 1000-round boxes, as shown with the cartridge type in the table, and reflect both what I shot and the basis for the cost/round. The 10-shot spread is the largest distance between any 2 of the 10 shots.

The trigger pull on the Model 511 is noticeably lighter than the 10/22 and that resulted in at least two flyers, so I used the nine-shot spread for evaluating the relative accuracy of the different types of ammo. These results indicate that the bolt action rifle with the longer barrel (Remington Model 511) was generally more accurate than the Ruger 10/22, based on the average group size of the 9-shot string, although it is worth noting that the smallest groups were shot with the Ruger.

There were no failure-to-fire, failure-to-feed, or failure-to-eject rounds with any of the 340 rounds shot. This is most likely due to shooting only 10 rounds of each ammo type in the semi-auto rifle. In my .22 steel competition experience, it’s not unusual for me to experience one or two FTFs during a competition (around 125-150 rounds). This varies depending upon the ammo used. While I have my preferences to avoid FTFs, I don’t have hard data and don’t want to put anecdotal data in print.

The second table shows the results sorted by group size (nine-shot spread in inches) for each rifle. Only two of the nine-shot groups were over two inches (barely) and only eight were between 1.5 and 2 inches. The other 24 nine-shot groups were all less than 1.5 inches (5.75 minutes of angle).

As can be seen in the second table, there are differences in the order of the ammo types. The CCI AR Tactical, RWS Target Rifle, and CCI Mini-Mag HP were in the top five of both groups.

The photo of the targets shows the results for these three ammo types for both rifles. The grid shown on these targets is one inch.

There were also some significant differences in the performance order. For example, the Remington M-22, which is the round I have been using in .22 steel competitions lately, was in the top five for the 511 but next to last with the 10/22 and the WWB ammo had a similar relationship. The Browning Performance Rimfire was at the top with the Ruger 10/22, but near the bottom of the list for the Remington 511. I’m guessing that the inconsistency in these rankings is partly due to inconsistency of the human shooter (me).

Although not presented here, I also sorted the results by the 10-shot spread. The order changed quite a bit for a few of the ammo types.

For the Ruger, the average difference between the 9- and 10-shot groups was about a quarter-inch, with a maximum difference of about a half-inch. In comparison, the average difference for the Remington 511 was about a half-inch, with a maximum of 1.6 inches.

The larger differences with the Remington are due to two ammo types (Browning Performance Rimfire and Remington Golden Bullet 40gr) where I shot a flyer with the 511 rifle (most likely due to the lighter trigger on the 511). If I delete those ammo types from the statistics, the averages and maximums for each rifle are close.

One of the more accurate rounds was the RWS Target Rifle. This ammo is German-made and costs almost 15 cents/round (top targets in photo). If I remember correctly, I bought a handful of boxes at Gander Mountain during their liquidation sale without any idea whether it was worth buying. On the other hand, the most expensive ammo, Federal Gold Medal Ultra Match, at 16 cents/round, did not produce as small a group as several of the other ammo types.

So what did I learn from this experience?

  • Perhaps the most important is that a one-inch group at 25 yards is almost four minutes of angle (see the worked numbers after this list). This level of accuracy is laughable for most rifles. I’m confident that I could shoot tighter groups with my rifles if I used a steadier support, including support near the rear of the rifle. The 4 MOA I got at 25 yards is better than the six-inch group I often shoot at 100 yards (6 MOA). While 3 MOA accuracy is nothing to brag about, remember that we are talking about .22 ammo and unmodified rifles.
  • Now that I have done this once, I’ll repeat it in the future, although maybe not with all 17 types in one sitting. The next time I do this, I want to borrow a lead sled or a similar rest to see how much difference that makes.
  • I did not report this in the results, but there were some significant differences in the point of impact between the different types of ammo. This is likely due to the different velocities and bullet weights (duh!), but the relationship between POI and velocity/weight was not obvious when I analyzed the data. The differences in POI might also be related to differences between advertised velocity and the actual velocity out of a given barrel. Given that there was a five-inch difference in barrel lengths, I would expect the velocities from each rifle to be different. I did not measure velocity, but that would be good information to add to this database if anyone wants to take on that effort.
  • There were some fairly significant differences in the groups with each rifle. This may be because I shot a little faster with the 10/22, since it is a semi-auto and the red dot made sighting a little faster. It is also worth noting that the red dot on the Bushnell sight is a 3 MOA dot, so the average group size for the 10/22 is only about a half inch larger than the size of the red dot.
  • It took almost three hours of shooting to go through all 17 types of ammo. That’s a lot of time hunched over a couple of rifles. I felt like I was not shooting as well toward the end, but when I look at the results, several of the larger groups were in the top five types of ammo in the evaluation. So it may be that I hadn’t warmed up at the beginning and that the ammo types shot at the beginning would have better results if they had been shot later in the day.
  • The signs of variability that I have mentioned above indicate that there is more work to be done to identify the best cartridge for a given firearm. With this information as a base, I can keep better notes the next time I do some precision shooting with either of these rifles.
  • I shoot in .22 steel competitions on occasion and enjoy shooting my .22 at the range. I don’t shoot my .22 as much as I used to now that I reload my other calibers, but shooting the .22 is still a lot of fun, especially trying to get a small group at 100 yards. Over the last few years, I have noticed differences in accuracy between cheaper plinking ammo and more expensive target ammo. I have also noticed failure-to-fire events during .22 steel competitions with some of the ammo types I use. I need to keep better notes on the rounds used and FTFs in future competitions.
  • For the most part, I can continue to use the same ammo for the steel competition in my Ruger 10/22. Those are generally the CCI AR Tactical, CCI Mini-Mag HP and Remington Golden Bullet 36 gr HP. These competitions are timed events shooting at large steel targets, so speed is more important than precise accuracy.
  • The love for my Remington Model 511 was reaffirmed. I’m looking forward to using it next month in a long-range rimfire silhouette contest (not a speed event). With the information from my experiment, I know I’ll be using the remaining stock of my RWS ammo for that competition.
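
For anyone who wants to check the minute-of-angle numbers in the first bullet, the arithmetic works out as follows, using the common approximation that 1 MOA subtends about 1.047 inches at 100 yards:

1 MOA at 25 yards ≈ 1.047 in / 4 ≈ 0.26 in
1 in group / 0.26 in per MOA ≈ 3.8 MOA

That 3.8 MOA is where the “almost four minutes of angle” figure comes from, and the same math puts a six-inch group at 100 yards at just under 6 MOA.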

 


via The Truth About Guns
.22LR Ammunition Shootout: Two Rifles, 17 Brands

Introducing Docker Application Guides

In April of 2017 we announced the Modernize Traditional Applications (MTA) program at Docker. The goal of MTA is to take the vast back catalogs of existing applications that are running in enterprise organizations today, and bring them to a modern container platform, without requiring extensive rewrites or refactoring. I’m excited to share part of our learning from the MTA program and announce the release of Docker Application Guides.

 

Oracle WebLogic MedRec Sample Application on Docker Enterprise Edition

Docker Application Guides demonstrate how to deploy popular enterprise applications – Oracle WebLogic and IBM MQ with WebSphere Liberty – on Docker Enterprise and Docker Desktop. Application Guides include example architectures and guidance for selecting Certified Docker container images from Docker Store and deploying a prototype application, orchestrated by Docker Swarm or Kubernetes.

It is important to note that Docker Application Guides are one piece of our prescriptive Docker customer journey to production. In addition to the knowledge transfer and process transformation that come with our full approach, Application Guides provide a reference for deploying common enterprise applications on the Docker Enterprise platform.

The first Docker Application Guides are designed to help you plan and deploy an Oracle WebLogic application stack or an IBM MQ with WebSphere Liberty sample stack on either Docker Desktop, for local development and testing, or Docker Enterprise, which is where the applications would run in production. The Application Guides include instructions for orchestrating with either Docker Swarm or Kubernetes, demonstrating the flexibility of the Docker container platform.
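
For a flavor of what deployment looks like once you have a guide’s stack file in hand, here is a minimal sketch (the Compose file and the stack name medrec are illustrative, not taken from the guides):

# Deploy the sample stack to a Swarm cluster
docker stack deploy -c docker-compose.yml medrec

# Verify that the stack's services came up
docker stack services medrec

With Kubernetes selected as the orchestrator, the same platform can run the workload there instead; the guides walk through both paths.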

IBM MQ and IBM WebSphere Liberty on Docker Enterprise Edition

Where to Access Docker Application Guides

Docker Application Guides are built for Docker Enterprise and are available via the Docker Success Center.

You can access each Docker Application Guide using the links below:

You can deploy and test enterprise applications on Docker Enterprise and Docker Desktop, both of which include Swarm and Kubernetes orchestration:

  • Get Docker Enterprise, the leading enterprise-ready container platform to cost-effectively build and manage your entire application portfolio
  • Get Docker Desktop for macOS or Windows, the simplest way to build and test applications on your own machine

What’s Next?

We’d love your feedback on additional capabilities and applications you’d like to see us include in our Application Guides. We already have some ideas and are working on additional guides, but ultimately these are for you, so let us know what you’d like to see next. You can send us feedback on the Docker Community Slack in the #docker-ee-tools channel.





The post Introducing Docker Application Guides appeared first on Docker Blog.

via Docker Blog
Introducing Docker Application Guides

Lower colon cancer death risk among diet-soda drinkers

New research finds an association between drinking artificially sweetened beverages and a significantly lower risk of colon cancer recurrence and cancer death.

“Artificially sweetened drinks have a checkered reputation in the public because of purported health risks that have never really been documented,” says senior author Charles S. Fuchs, director of Yale Cancer Center. “Our study clearly shows they help avoid cancer recurrence and death in patients who have been treated for advanced colon cancer, and that is an exciting finding.”

“…in terms of colon cancer recurrence and survival, use of artificially sweetened drinks is not a health risk, but is, in this study, a healthier choice.”

Fuchs and his team found that, in the 1,018-patient analysis, participants who drank one or more 12-ounce servings of artificially sweetened beverages per day had a 46 percent lower risk of cancer recurrence or death compared to those who didn’t drink these beverages. Researchers defined these soft drinks as caffeinated colas, caffeine-free colas, and other carbonated beverages (such as diet ginger ale).

A second analysis found that about half that benefit was due to substituting an artificially sweetened beverage for a beverage sweetened with sugar.

“While the association between lower colon cancer recurrence and death was somewhat stronger than we suspected, the finding fits in with all that we know about colon cancer risk in general,” Fuchs says. “Factors such as obesity, sedentary lifestyle, a diet linked to diabetes—all of which lead to an excess energy balance—are known risk factors. We now find that, in terms of colon cancer recurrence and survival, use of artificially sweetened drinks is not a health risk, but is, in this study, a healthier choice.”

“…after cancer has developed and advanced, would a change in lifestyle—drinking artificially sweetened beverages—change the outcome of the cancer post-surgery?”

This research follows on the heels of a number of studies that prospectively followed stage III colon cancer patients enrolled in a National Cancer Institute-supported clinical trial testing two different forms of postsurgical chemotherapy.

Participants completed comprehensive nutrition questionnaires probing consumption of more than 130 different foods and drinks over the span of many months. One questionnaire was completed while patients underwent chemotherapy between 1999 and 2001, and another six months after chemotherapy ended. Investigators then tracked cancer recurrence and patient death rates for about seven years and found, among other things, that the two chemotherapy regimens offered equivalent benefits.

Researchers designed the studies, which were embedded as part of the overall clinical trial, to find associations between specific foods and drinks and colon cancer risk and death. They were not aiming to prove definitive cause and effect.

One study found that clinical trial participants who drank coffee had a substantially reduced risk of cancer recurrence and death. Another found a similar benefit in patients who ate tree nuts. This study looked at artificially sweetened beverages because an earlier study had concluded sweetened beverages dramatically increased risk of colon cancer development.

“We wanted to ask the question if, after cancer has developed and advanced, would a change in lifestyle—drinking artificially sweetened beverages—change the outcome of the cancer post-surgery?” Fuchs says.

He adds that the health impact of such soft drinks warrants study: “Concerns that artificial sweeteners may increase the incidence of obesity, diabetes, and cancer have been raised, but studies on issues such as weight gain and diabetes have been very mixed, and, regarding cancer, epidemiologic studies in humans have not demonstrated such relationships.”

The study appears in PLOS ONE. The National Cancer Institute of the National Institutes of Health supported the research in part. Pharmacia & Upjohn Company, now Pfizer Oncology, and the American Institute for Cancer Research provided additional support.

Non-federal sponsors did not participate in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.

Source: Yale University

via Futurity.org
Lower colon cancer death risk among diet-soda drinkers