MySQL Workbench 8.0.19 has been released


https://ift.tt/2NmAS69

Dear MySQL users,

The MySQL developer tools team announces 8.0.19 as our General Availability (GA) release for MySQL Workbench 8.0.

For the full list of changes in this revision, visit
https://dev.mysql.com/doc/relnotes/workbench/en/news-8-0.html

For discussion, join the MySQL Workbench Forums:
http://forums.mysql.com/index.php?152

The release is now available in source and binary form for a number of platforms from our download pages at:

http://dev.mysql.com/downloads/tools/workbench/

Enjoy!

via MySQL Workbench 5.2.35 https://ift.tt/2K1w83p

January 13, 2020 at 01:16PM

Tips for Delivering MySQL Database Performance – Part Two


https://ift.tt/2QFnrQJ

The management of database performance is an area where businesses and administrators often find themselves spending more time than they expected.

Monitoring and reacting to production database performance issues is one of the most critical tasks in a database administrator's job. It is an ongoing process that requires constant care. Applications and their underlying databases usually evolve over time: they grow in size, number of users, and workload, and their schemas change along with the code.

Long-running queries are sometimes unavoidable in a MySQL database, and in some circumstances a long-running query can be a harmful event. If you care about your database, optimizing query performance and detecting long-running queries must be performed regularly.

In this blog, we are going to take a more in-depth look at the actual database workload, especially on the running queries side. We will check how to track queries, what kind of information is available in MySQL metadata, and what tools to use to analyze such queries.

Handling The Long-Running Queries

Let’s start with checking long-running queries. First of all, we have to know the nature of the query: is it expected to be long-running or short-running? Some analytic and batch operations are supposed to be long-running queries, so we can skip those for now. Also, depending on the table size, modifying a table structure with the ALTER command can be a long-running operation (especially in MySQL Galera Clusters). Common reasons for a query taking longer than expected include:

  • Table lock – the table is locked by a global lock or an explicit table lock while the query is trying to access it.
  • Inefficient query – the query uses non-indexed columns for lookups or joins, so MySQL takes longer to match the condition (see the EXPLAIN sketch after this list).
  • Deadlock – the query is waiting to access rows that are locked by another request.
  • Dataset does not fit into RAM – if your working set fits into the buffer pool, SELECT queries are usually fast; if it does not, reads spill to disk and slow down.
  • Suboptimal hardware resources – this could be slow disks, RAID rebuilding, a saturated network, etc.
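
For the inefficient query case, EXPLAIN shows whether MySQL can use an index for the lookup. A minimal sketch against a hypothetical orders table (the table, column, and index names are illustrative, not from the original post):

EXPLAIN SELECT * FROM orders WHERE customer_email = 'jane@example.com';
-- type: ALL in the output means a full table scan, i.e. no usable index.

-- Adding an index on the lookup column typically changes type to ref:
ALTER TABLE orders ADD INDEX idx_customer_email (customer_email);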

If you see a query taking longer than usual to execute, do investigate it.

Using the MySQL Show Process List

mysql> SHOW PROCESSLIST;

This is usually the first thing you run when there are performance issues. SHOW PROCESSLIST is an internal MySQL command which shows you which threads are running. You can also see this information in the information_schema.PROCESSLIST table or with the mysqladmin processlist command. If you have the PROCESS privilege, you can see all threads. You can see information like the query ID, execution time, who runs it, the client host, etc. The information varies slightly depending on the MySQL flavor and distribution (Oracle, MariaDB, Percona).

SHOW PROCESSLIST;

+----+-----------------+-----------+------+---------+------+------------------------+------------------+----------+
| Id | User            | Host      | db   | Command | Time | State                  | Info             | Progress |
+----+-----------------+-----------+------+---------+------+------------------------+------------------+----------+
|  2 | event_scheduler | localhost | NULL | Daemon  | 2693 | Waiting on empty queue | NULL             |    0.000 |
|  4 | root            | localhost | NULL | Query   |    0 | Table lock             | SHOW PROCESSLIST |    0.000 |
+----+-----------------+-----------+------+---------+------+------------------------+------------------+----------+

We can see the offending query right away in the output; in the example above it is the one waiting on a table lock. But how often do we stare at those processes? This is only useful if you are aware of the long-running transaction; otherwise, you wouldn’t know until something happens – connections piling up, or the server getting slower than usual.
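
Rather than eyeballing the full output, you can filter the same data through the information_schema.PROCESSLIST table mentioned above. A minimal sketch; the 60-second threshold is an arbitrary example, not from the original post:

SELECT id, user, host, db, command, time, state, info
FROM information_schema.PROCESSLIST
WHERE command <> 'Sleep'
  AND time > 60
ORDER BY time DESC;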

Using MySQL Pt-query-digest

If you would like to see more information about a particular workload, use pt-query-digest. The pt-query-digest is a command-line tool from Percona to analyze MySQL queries. It’s part of the Percona Toolkit. It supports the most popular 64-bit Linux distributions, such as Debian, Ubuntu, and Red Hat.

To install it, you must configure the Percona repositories and then install the percona-toolkit package.

Install Percona Toolkit using your package manager:

Debian or Ubuntu:

sudo apt-get install percona-toolkit

RHEL or CentOS:

sudo yum install percona-toolkit

Pt-query-digest accepts data from the process list, general log, binary log, slow log, or tcpdump. In addition to that, it’s possible to poll the MySQL process list at a defined interval – a process that can be resource-intensive and far from ideal, but that can still be used as an alternative.

The most common source for pt-query-digest is the slow query log. You can control how much data goes into it with the parameter log_slow_verbosity (available in Percona Server), which accepts the following values:

  • microtime – queries with microsecond precision.
  • query_plan – information about the query’s execution plan.
  • innodb – InnoDB statistics.
  • minimal – equivalent to enabling just microtime.
  • standard – equivalent to enabling microtime,innodb.
  • full – equivalent to all other values OR’ed together, without the profiling and profiling_use_getrusage options.
  • profiling – enables profiling of all queries in all connections.
  • profiling_use_getrusage – enables usage of the getrusage function.

source: Percona documentation

For completeness, use log_slow_verbosity=full, which is a common choice.

Slow Query Log

The slow query log can be used to find queries that take a long time to execute and are therefore candidates for optimization. It captures slow queries (SQL statements that take more than long_query_time seconds to execute) and, optionally, queries that do not use indexes for lookups (log_queries_not_using_indexes). The feature is not enabled by default; to enable it, set the following lines and restart the MySQL server:

[mysqld]
slow_query_log=1
log_queries_not_using_indexes=1
long_query_time=0.1
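
All three are dynamic system variables, so you can also enable the slow query log at runtime without a restart (note that a global long_query_time change applies only to new connections). A minimal sketch:

SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 0.1;
SET GLOBAL log_queries_not_using_indexes = 'ON';

-- Verify the current settings:
SHOW GLOBAL VARIABLES LIKE 'slow_query_log%';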

However, examining a long slow query log can be a time-consuming task. There are tools to parse MySQL slow query log files and summarize their contents, such as mysqldumpslow and pt-query-digest.

Performance Schema

Performance Schema is a great tool for monitoring MySQL Server internals and execution details at a lower level. It had a bad reputation in early versions (5.6) because enabling it often caused performance issues; recent versions, however, do not harm performance. The following tables in Performance Schema can be used to find slow queries (see the example query after the list):

  • events_statements_current
  • events_statements_history
  • events_statements_history_long
  • events_statements_summary_by_digest
  • events_statements_summary_by_user_by_event_name
  • events_statements_summary_by_host_by_event_name
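
Of these, events_statements_summary_by_digest is often the best starting point: it aggregates normalized statements, so ordering by total wait time surfaces the most expensive query patterns. A minimal sketch (Performance Schema timers are in picoseconds, hence the divisions):

SELECT schema_name,
       digest_text,
       count_star AS exec_count,
       ROUND(sum_timer_wait / 1e12, 3) AS total_latency_s,
       ROUND(avg_timer_wait / 1e9, 3) AS avg_latency_ms
FROM performance_schema.events_statements_summary_by_digest
ORDER BY sum_timer_wait DESC
LIMIT 10;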

MySQL 5.7.7 and higher include the sys schema, a set of objects that helps DBAs and developers interpret data collected by the Performance Schema into a more easily understandable form. Sys schema objects can be used for typical tuning and diagnosis use cases.
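
For example, the sys.statement_analysis view presents statement digests with human-readable latencies, and the view itself is already sorted by total latency in descending order. A minimal sketch:

-- Top 10 statements by total latency; the view is pre-sorted.
SELECT query, db, exec_count, total_latency, avg_latency
FROM sys.statement_analysis
LIMIT 10;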

Network Tracking

What if we don’t have access to the slow query log or direct application logs? In that case, we could use a combination of tcpdump and pt-query-digest to help capture queries off the wire.

$ tcpdump -s 65535 -x -nn -q -tttt -i any port 3306 > mysql.tcp.txt

Once the capture process ends, we can proceed with processing the data:

$ pt-query-digest --limit=100% --type tcpdump mysql.tcp.txt > ptqd_tcp.out

ClusterControl Query Monitor

ClusterControl Query Monitor is a module in ClusterControl that provides combined information about database activity. It can gather information from multiple sources, such as SHOW PROCESSLIST or the slow query log, and present it in a pre-aggregated way.

ClusterControl Top Queries

SQL monitoring is divided into three sections.

Top Queries

This view presents information about queries that consume a significant share of resources.

ClusterControl Top Queries

Running Queries

This is a process list combining information from all database cluster nodes into a single view. You can use it to kill queries that affect your database operations.

ClusterControl Running Queries
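
Under the hood, killing a query maps to MySQL’s KILL statement, which you can also run manually against an Id taken from the process list. A minimal sketch (the thread Id 4 is illustrative):

KILL QUERY 4; -- terminates only the running statement, keeps the connection
KILL 4;       -- terminates the whole connection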

Query Outliers

This view presents the list of queries whose execution time is longer than average.

ClusterControl Query Outliers

Conclusion

This is all for part two. This blog is not intended to be an exhaustive guide on how to enhance database performance, but it hopefully gives a clearer picture of which things can become essential, along with some of the basic parameters that can be configured. Do not hesitate to let us know if we’ve missed any important ones in the comments below.

 

technology

via Planet MySQL https://ift.tt/2iO8Ob8

January 10, 2020 at 09:11PM

Tickets on Sale for Ohio Outdoor Life/Field & Stream Expo, Columbus, March 20-22


https://ift.tt/3dedq6u

Ohio Outdoor Life/Field & Stream Expo

COLUMBUS, Ohio – -(AmmoLand.com)- Tickets are now on sale for the Ohio Outdoor Life/Field & Stream Expo Presented by Suzuki KingQuad in Columbus, Ohio, Friday, March 20 through Sunday, March 22 at the Ohio Expo Center in the Celeste Center and Bricker Building and can be purchased online at https://ift.tt/2UcgeIA.

The Expo features the latest products, gear and equipment for hunting, fishing and outdoor enthusiasts. Attendees will have the chance to meet industry insiders, attend educational seminars and demonstrations, test out the newest products, shop for show deals and even bring their deer to have it scored.

Single-day tickets are $10.50/Adult (ages 18+) in advance and $3/Youth (ages 13-17). Kids ages 12 and under are free but will need a ticket, which can be obtained online or at the box office. Adult 2-day tickets are also available in advance for $20, with a 2-day Youth ticket at $6 (ages 13-17).

A special Family 4-Pack is available for just $25, which includes admission for any one day of the show for 2 Adults and 2 Youths (ages 13-17). All four attendees must enter the show at the same time for this offer. The Outdoor Life/Field & Stream Expo is a fun-filled family experience, including archery ranges for kids and numerous Door Prize giveaways.

Save money by purchasing tickets in advance online as ticket prices will increase the day of the show at the door.

A 1-year subscription to Field & Stream or Outdoor Life magazine is included with paid admission as well as door prize entry.

Expo Features include:

  • Hundreds of exhibitors, many offering special show-pricing
  • Seminars and demos, with industry insiders and experts
  • Family fun, including kids archery
  • Deer Scoring
  • Daily Door Prizes

More details on Event Features and Seminars are coming soon.

For more information on the Ohio Outdoor Life/Field & Stream Expo Presented by Suzuki KingQuad as well as to purchase tickets online, visit https://ift.tt/2UcgeIA.

The Ohio Outdoor Life/Field & Stream Expo is sponsored by Suzuki KingQuad.

The post Tickets on Sale for Ohio Outdoor Life/Field & Stream Expo, Columbus, March 20-22 appeared first on AmmoLand.com.

guns

via AmmoLand.com https://ift.tt/2okaFKE

January 9, 2020 at 11:50AM

‘Doom’ re-releases now support add-ons, quick saves and 60FPS


https://ift.tt/2TcaGPh

Bethesda’s re-releases of the first two Doom games are about to catch up to the originals in key areas — and in a few ways, surpass them. The id Software titles are receiving updates that, among other things, introduce support for add-ons — yes, even on mobile. This doesn’t mean you can load in any old WAD file on consoles, but it will work for Android and PC players — and everyone will get a mix of official and unofficial add-ons. The initial selection includes the two Final Doom mods (The Plutonia Experiment and TNT: Evilution), No Rest for the Living and John Romero’s Sigil. Other packs will be available on a "regular basis."

Most add-ons will be limited to the particular game they’re built for (the Final Doom mods are an "exception"). However, you’ll be glad to hear that the developers are lifting engine limitations to allow for MegaWADs, and they’re even introducing real-time MIDI playback to preserve the custom music in some mods.

There are some notable practical improvements beyond this. Both Doom titles now run at 60 frames per second on all platforms instead of the earlier 35FPS. There’s an aspect ratio option to match the 4:3 output of the original game. You can quick save (and quick load) if you’re worried you won’t survive an encounter. Level select lets you skip to a favorite map. And if you’re playing on a device with a gamepad, quick weapon select and a weapon carousel will help you quickly switch to the gun you need.

In many ways, this upgrade is as much about preserving the spirit of Doom as it is enticing modern gamers. The initial releases are widely credited with nurturing the very concept of game add-ons — it’d be difficult to convey this to a new generation of players if they were forced to play the built-in levels and nothing else.

Source: Bethesda

geeky,Tech,Database

via Engadget http://www.engadget.com

January 9, 2020 at 03:09PM

A Continuous Roll of Surebonder Hot Glue Sticks?


https://ift.tt/2tFFld4

Surebonder 5ft Hot Glue Stick Roll in use

Did you know that you can buy your glue sticks on a roll? Sure you can find glue sticks in 6″, 10″, and even 15″ lengths, but what if you need even more than that? That’s where Surebonder’s glue rolls come in.

No longer do you have to stop and waste time loading glue sticks. These clear glue rolls work in low, high, and dual temperature glue guns and are made in the USA.

The glue rolls come in four options:

  • RM-5 mini-size 5 ft. roll ~$4-5
  • RR-5 full-size 5 ft. roll ~$8-9
  • RM-188 mini-size 188 ft. roll ~$37
  • RR-77 full-size 77 ft. roll ~$36-38
Surebonder 77ft Hot Glue Stick Roll in box

I’ve found two reliable retailers online selling the glue rolls: Amazon and Walmart. Their prices don’t exactly match the list price. For example at Amazon the 5 ft standard roll is way more expensive at almost $9. Walmart is more reasonable at $4 (plus shipping). Both retailers sell the 77 foot standard roll for around $36.

Buy Now via Amazon
Buy Now: 5ft roll via Walmart
Buy Now: 77ft roll via Walmart

Discussion

This seems like an obscure product to post, but I’m of two minds about its usefulness and thought I’d ask our readers for their opinion.

When I first saw it, I thought: “hey, that’s a pretty good idea!” After thinking about it for a while I thought: “that would have very limited practical use.”

My opinion was turning away from: “this is a good idea,” until I thought, “wait a minute, you don’t have to drag around the whole roll, you can cut it to any length you want.”

I’ve used several types of glue sticks: the minis, which seem to run out way too fast; the short standard-size sticks, which also seem to run out too fast; and the longer 15″ sticks, which seem to be about right. Still, there are times when it would be nice not to run out in the middle of using the glue gun, so I could see where cutting off a 2- or 3-foot piece would be very handy.

What uses can you think of for a long length glue roll?

hacking

via ToolGuyd https://toolguyd.com

January 8, 2020 at 10:11AM

DIY Suppressor


https://ift.tt/3aXVVoU

Here’s How to Legally Make Your Own Can, Save Yourself a Year of Wait Time, and a Few Hundred Bucks

Photos by Kenda Lenseigne

Making your own firearm from a partially completed receiver is something we’ve covered extensively in the past, but there’s a perfectly legal route to achieving the same ends with a suppressor, which in many ways is more effective and attractive for the average law-abiding citizen. In order to transfer a factory-made can from your local dealer, you’ll have to fill out an ATF Form 4, pay a $200 tax, submit fingerprints and a passport photo, and then wait. And wait.

After around 11 months, you might get the chance to go pick up the property you paid for so long ago, or you might just have to wait a bit longer. If this doesn’t sound quite so appealing, then you could always go the DIY route. There’s no getting around the paperwork and tax stamp, but you end up with a workable solution in weeks, rather than months. We covered the Form 1 E-File process in Issue 44, and the article is currently on RECOILweb.com, so if you haven’t already, you may want to familiarize yourself with it. There’s one departure from the procedure outlined in the article, and that’s the bit which deals with describing the manufacturer of the item.

On the drop-down menu, select FMI (for Form 1 manufactured), and you’re on your way. You’ll also have to describe the length and caliber of the can you’re going to make. Tip: Some people get wrapped around the axle when it comes to fingerprints. There’s absolutely no reason to make an appointment and pay a third party to fingerprint you, when you’re perfectly capable of smearing ink on your own digits. Order a fingerprint kit from Amazon, and do it yourself in the comfort of your own home.

Once your Form 1 has been approved, which usually takes around three weeks, you can then buy a tube, spacers, baffles, and endcaps from the many online vendors that exist on the fringes of the interwebs. Due to the nature of NFA law, these will be described in rather coy terms, and you may wind up purchasing “barrel shrouds,” “solvent traps,” “oil filter kits,” or “storage cups,” all of which are largely useless for their advertised purpose, but give the vendors a fig leaf of deniability. Yes, it’s all a bunch of bullsh*t, but it’s the system we’re stuck with.

Once your components arrive, you can then set to work engraving the tube to meet the legal requirements of the National Firearms Act (see RECOIL Issue 44). You could go get this done on a laser engraver and make it look all professional-like, or you could just bust out the Dremel. We did the latter, as it’s going to be wrapped in a suppressor cover anyway. With your tube engraved, you can then drill holes in the baffles and endcap, screw everything together, and head to the range with your shiny new can. Enjoy!


Source
ATF E-Forms: https://eforms.atf.gov/
Form 1 Suppressor forum: http://form1suppressor.boards.net/
Parts Vendors: https://superprecisionconcepts.com/
https://sdtacticalarms.com/

guns

via Recoil https://ift.tt/2ycx5TA

January 7, 2020 at 09:02AM

Facebook bans deceptive deepfakes and some misleadingly modified media


https://ift.tt/2ZWEMrq

Facebook wants to be the arbiter of truth after all. At least when it comes to intentionally misleading deepfakes and heavily manipulated and/or synthesized media content, such as AI-generated photorealistic human faces that look like real people but aren’t.

In a policy update announced late yesterday, the social network’s VP of global policy management, Monika Bickert, writes that it will take a stricter line on manipulated media content from here on in — removing content that’s been edited or synthesized “in ways that aren’t apparent to an average person and would likely mislead someone into thinking that a subject of the video said words that they did not actually say”.

However, edits for quality or cuts and splices to videos that simply curtail or change the order of words are not covered by the ban.

Which means that disingenuous doctoring — such as this example from the recent UK General Election (where campaign staff for one political party edited a video of a politician from a rival party who was being asked a question about Brexit to make it look like he was lost for words when in fact he wasn’t) — will go entirely untouched by the new ‘tougher’ policy. Ergo there’s little to trouble Internet-savvy political ‘truth’ spinners here. The disingenuous digital campaigning can go on.

Instead of grappling with that sort of subtle political fakery, Facebook is focusing on quick PR wins — around the most obviously inauthentic stuff where it won’t risk accusations of partisan bias if it pulls bogus content.

Hence the new policy bans deepfake content that involves the use of AI technologies to “merge, replace or superimpose content onto a video, making it appear to be authentic” — which looks as if it will capture the crudest stuff, such as revenge deepfake porn which superimposes a real person’s face onto an adult performer’s body (albeit nudity is already banned on Facebook’s platform).

It’s not a blanket ban on deepfakes either, though — with some big carve outs for “parody or satire”.

So it’s a bit of an open question whether this deepfake video of Mark Zuckerberg, which went viral last summer — seemingly showing the Facebook founder speaking like a megalomaniac — would stay up or not under the new policy. The video’s creators, a pair of artists, described the work as satire so such stuff should survive the ban. (Facebook did also leave it up at the time.)

But, in future, deepfake creators are likely to further push the line to see what they can get away with under the new policy.

The social network’s controversial policy of letting politicians lie in ads also means it could, technically, still give pure political deepfakes a pass — i.e. if a political advertiser was paying it to run purely bogus content as an ad. Though it would be a pretty bold politician to try that.

More likely there’s more mileage for political campaigns and opinion influencers to keep on with more subtle manipulations. Such as the doctored video of House speaker Nancy Pelosi that went viral on Facebook last year, which had slowed down audio that made her sound drunk or ill. The Washington Post suggests that video — while clearly potentially misleading — still wouldn’t qualify to be taken down under Facebook’s new ‘tougher’ manipulated media policy.

Bickert’s blog post stipulates that manipulated content which doesn’t meet Facebook’s new standard for removal may still be reviewed by the independent third party fact-checkers Facebook relies upon for the lion’s share of ‘truth sifting’ on its platform — and who may still rate such content as ‘false’ or ‘partly false’. But she emphasizes it will continue to allow this type of bogus content to circulate (while potentially reducing its distribution), claiming such labelled fakes provide helpful context.

So Facebook’s updated position on manipulated media sums to ‘no to malicious deepfakes but spindoctors please carry on’.

“If a photo or video is rated false or partly false by a fact-checker, we significantly reduce its distribution in News Feed and reject it if it’s being run as an ad. And critically, people who see it, try to share it, or have already shared it, will see warnings alerting them that it’s false,” Bickert writes, claiming: “This approach is critical to our strategy and one we heard specifically from our conversations with experts.

“If we simply removed all manipulated videos flagged by fact-checkers as false, the videos would still be available elsewhere on the internet or social media ecosystem. By leaving them up and labelling them as false, we’re providing people with important information and context.”

Last month Facebook announced it had unearthed a network of more than 900 fake accounts that had been spreading pro-Trump messaging — some of which had used false profile photos generated by AI.

The dystopian development provides another motivation for the tech giant to ban ‘pure’ AI fakes, given the technology risks supercharging its fake accounts problem. (And, well, that could be bad for business.)

“Our teams continue to proactively hunt for fake accounts and other coordinated inauthentic behavior,” suggests Bickert, arguing that: “Our enforcement strategy against misleading manipulated media also benefits from our efforts to root out the people behind these efforts.”

While still relatively nascent as a technology, deepfakes have shown themselves to be catnip to the media which loves the spectacle they create. As a result, the tech has landed unusually quickly on legislators’ radars as a disinformation risk — California implemented a ban on political deepfakes around elections this fall, for example — so Facebook is likely hoping to score some quick and easy political points by moving in step with legislators even as it applies its own version of a ban.

Bickert’s blog post also fishes for further points, noting Facebook’s involvement in a Deep Fake Detection Challenge which was announced last fall — “to produce more research and open source tools to detect deepfakes”.

She also says Facebook has been working with the news agency Reuters to offer free online training courses to help journalists and reporters identify manipulated visuals.

“As these partnerships and our own insights evolve, so too will our policies toward manipulated media. In the meantime, we’re committed to investing within Facebook and working with other stakeholders in this area to find solutions with real impact,” she adds.

technology

via TechCrunch https://techcrunch.com

January 7, 2020 at 06:01AM