Multi-Cloud SaaS Applications: Speed + Availability = Success!

In this blog post, we talk about how to run applications across multiple clouds (e.g. AWS, Google Cloud, Microsoft Azure) using Continuent Clustering. You want your business-critical applications to withstand node, datacenter, availability-zone or regional failures. For SaaS apps, you also want to bring data close to your application users for faster response times and a better user experience. With cross-cloud capability, Continuent also helps you avoid lock-in to any particular cloud provider.

The key to success for the database layer is to be available and respond rapidly.

From both a business and operational perspective, spreading the application across cloud environments from different vendors provides significant protection against vendor-specific outages and vendor lock-in. Running on multiple platforms provides greater bargaining leverage with each vendor, because they do not have a monopoly on your cloud operations.

Continuent Clustering is flexible and platform-agnostic, which means you can run it across many different environments. You may mix and match any and all of them (e.g. one or more clusters of three nodes each per environment), so a single deployment can span AWS, GCP, Azure and even your own bare-metal datacenter. For cost savings, test and development environments can run on VMs, Docker containers or even VirtualBox.

There are many challenges to running an application over large distances, including network latency for reads and getting local writes distributed to other regions.

Continuent Clustering provides a cohesive solution which addresses the various concerns when running a geo-distributed application.

Let’s look at the various factors, and how each is handled:

Be available

  • (local) – viewed from a local perspective, this is high availability (HA). If the MySQL server handling writes (and reads) becomes unavailable, the cluster automatically switches to another server with the same data and continues serving writes and reads to the application layer with as little downtime as possible (a manual-switch sketch follows this list).
  • (global) – viewed from a global perspective, this is disaster recovery (DR). Should an entire site, region, availability zone or even cloud become unavailable, another site with the same data takes over serving writes and reads to the application layer with as little downtime as possible.
  • (global) – the Tungsten Replicator is cluster-aware: if the remote node it pulls data from is lost completely, the Replicator automatically switches to another source node and picks up where it left off.
  • (global) – the Tungsten Connector is cluster- and site-aware, and routes both read and write requests to local and remote resources. When the Connector is installed directly on an application server, a multimaster cluster topology can withstand the loss of the entire database layer at one site by redirecting reads and writes to another region.
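Failover is automatic, but the same control plane also supports graceful manual switches from Tungsten's cctrl management shell. The sketch below is illustrative only; the host name db2 is a placeholder, and you should verify the exact commands against the Continuent documentation for your release.

cctrl                 # open the management shell on any cluster node
ls                    # inside cctrl: list datasources, their roles and states
switch to db2         # inside cctrl: gracefully promote db2 to master

Because the Connector is cluster-aware, application traffic follows the new master without any application-side changes.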

Respond rapidly to requests

  • (local) – by using the failover replicas as read sources, we offload read requests from the master, freeing up valuable resources there (CPU, memory, disk I/O, network bandwidth, etc.). This has the double effect of increasing performance on the write master and improving response time for reads (see the connection sketch after this list).
  • (global) – employing active/active multimaster clustering, writes to each region are replicated to all other regions. This makes the data available for local reads, so the database layer can respond much more quickly to requests for data that would otherwise have to be fetched from a remote site over the WAN, adding precious milliseconds to every query.
  • (global) – the built-in Tungsten Replicator provides loosely-coupled asynchronous data transfer, both to local read replicas and to all remote sites. Since WAN connections sometimes suffer high latency and even complete disconnects, the Replicator tracks every event, pauses when the link is down and resumes when the link becomes available.
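To make the routing concrete, here is a minimal sketch of the application's side of the conversation. The Connector speaks the plain MySQL wire protocol, so a standard client or driver is all you need; the host, port, credentials and exact read/write-split behavior below are placeholders that depend on how your Connector is deployed and configured.

# The application talks only to its local Tungsten Connector.
CONNECTOR=127.0.0.1

# A write: the Connector routes it to the current master, wherever
# that master lives (locally, or at another site after a failover).
mysql -h "$CONNECTOR" -P 3306 -u app -p"$APP_PASS" \
  -e "INSERT INTO orders (customer_id, total) VALUES (42, 99.50);"

# A read: with read/write splitting enabled, the Connector can send
# this to a nearby replica, offloading the master.
mysql -h "$CONNECTOR" -P 3306 -u app -p"$APP_PASS" \
  -e "SELECT total FROM orders WHERE customer_id = 42;"

The application never needs to know which physical node is the master; the Connector tracks that on its behalf.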

All of the above facets combine to make a polished diamond of a solution, fit for your company’s worldwide enterprise-quality deployment!

If you are interested in running a proof-of-concept, please contact us now!

via Planet MySQL

Too much of this protein may drive kidney cancer

Scientists have uncovered a potential therapeutic target for kidney cancers that have a common genetic change.

Scientists have known this genetic change can lead to an overabundance of blood vessels, which help feed nutrients to the tumors, but the latest finding shows a potential new cancer-driving pathway.

More than 90 percent of cases of the most common type of kidney cancer have a genetic change that leads to the loss of an important tumor suppressor gene called VHL.

In the study, which appears in Science, researchers identify a new downstream effect of this genetic change that is helping to drive kidney cancer: A protein called ZHX2 over-accumulates in these cells and helps to turn on other signals involved in cancerous growth.

The findings suggest that the protein is a potential new therapeutic target for clear cell renal cell carcinoma, which is the most common type of kidney cancer.

Losing VHL

“If you lose VHL, you will accumulate lots of this ZHX2 protein, which will turn on signals that promote kidney cancer,” says Qing Zhang, an assistant professor in the pathology & laboratory medicine and pharmacology department at the UNC School of Medicine. “This protein could be a potential therapeutic target used to treat kidney cancer on its own or in combination. The next step is to try to figure out how we can target it therapeutically.”

Clear cell renal cell carcinoma accounts for about 70 percent of all cases of kidney cancer, researchers report. Approximately 90 percent of patients with clear cell renal cell carcinoma have genetic mutations or alterations that cause them to lose the function of VHL. When the function of VHL is gone, cells can accumulate signals that trigger blood vessels to grow.

“VHL is the most important tumor suppressor in clear cell renal cell carcinoma,” Zhang says. “There are extensive reports showing that from initiation to tumor progression to metastasis—during the whole process of kidney cancer development—VHL plays a central role.

“It is important to understand how the VHL loss contributes to kidney cancer, and how we can therapeutically target the downstream effects of this loss in kidney cancer.”


There are US Food and Drug Administration-approved drugs that block cell signals involved in abnormal blood vessel production—which is a downstream effect of VHL loss—that are part of the standard of care for clear cell renal cell carcinoma. Patients can show little response to these drugs or can develop resistance, so Zhang and his colleagues wanted to search for other targets that accumulate in cells lacking VHL function that help to drive the abnormal cancerous growth.

“We wanted to understand, once VHL is lost, what else in kidney cancer cells is promoting oncogenesis?” Zhang says. “Therapeutically speaking, we’re trying to understand how we can target these novel signaling pathways, once we identify them.”

Too much ZHX2

The researchers created a screening technique to discover new molecules that might help drive cancer when VHL is lost. This led them to determine that kidney cancer cells lacking VHL usually had more ZHX2. By eliminating ZHX2 from their laboratory models, they inhibited cancer cell growth, invasion, and the cancer’s spread. In addition, they saw that it was involved with signals that can help cancer cells grow.


There have been major advances in the treatment of kidney cancer with the development of molecularly-targeted therapies and immune-based treatments, says William Kim, associate professor of medicine and genetics at the UNC School of Medicine. However, additional treatments are needed to reach more patients with metastatic disease.

“The vast majority of kidney cancers have mutations in VHL, so it makes it a very important gene to investigate,” Kim says. “In the last decade or more, we’ve had quite a number of major treatment advances in kidney cancer. There are nearly a dozen FDA-approved treatments now for this disease, but many of them are similar.


“Studies like this are important because they delineate the underlying biology of kidney cancer and identify novel, distinct pathways to develop drugs against.”

A US Department of Defense Career Development Award, the University Cancer Research Fund, and the National Cancer Institute funded the work.

Source: UNC-Chapel Hill


via Futurity.org

Episode 165 Scott Adams: Meeting President Trump (without details of course), Don Lemon Tweet

Topics: 

  • What it was like, to meet President Trump
  • President Trump’s charisma levels are off the chart impressive
  • Dumbest guy on television, the Don Lemon tweet

 

I fund my Periscopes and podcasts via audience micro-donations on Patreon. I prefer this method over accepting advertisements or working for a “boss” somewhere because it keeps my voice independent. No one owns me, and that is rare. I’m trying in my own way to make the world a better place, and your contributions help me stay inspired to do that.

See all of my Periscope videos here.

Find my WhenHub Interface app here.


via Dilbert Blog

Kasich signs bill protecting businesses that invest in data security

Wary of data breaches and a mounting challenge for businesses to protect their digital assets, Gov. John Kasich signed into law on Friday a bill that aims to prod businesses to beef up security. Senate Bill 220 creates a legal incentive for companies to voluntarily invest in better cybersecurity to protect customer information. The law, introduced in the fall by Sen. Bob Hackett, R-London, and State Sen. Kevin Bacon, R-Minerva Park, provides a legal "safe harbor" for companies that take steps…

via Columbus Business News – Local Columbus News | Business First of Columbus

Database Objects migration to RDS/ Aurora (AWS)

Applications and their related services are migrating more and more toward the cloud because of availability, elasticity, manageability, and so on. While moving the entire stack, we need to be especially cautious with the database tier.

Migrating DB servers is not a simple lift-and-shift operation. Rather, it requires proper planning and extra care to maintain data consistency between the existing DB server and the cloud server, by means of native replication or third-party tools.

The best way to migrate an existing MySQL database to RDS, in my opinion, is with a “logical backup”. Some common logical backup tools are listed below; a sketch of a multithreaded dump and restore follows the list.

  • mysqldump — single-threaded (most widely used)
  • mysqlpump — multithreaded
  • mydumper — multithreaded
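As a point of reference, a multithreaded dump and restore with mydumper/myloader might look like the sketch below. Host names, credentials and thread counts are placeholders; check the flags against the mydumper version you install.

# Dump the testdb schema with 4 parallel threads, compressed.
mydumper --host=source-db.example.com --user=backup_user --password="$PASS" \
  --database=testdb --threads=4 --compress --outputdir=/backups/testdb

# Restore the resulting directory into the target, also in parallel.
myloader --host=target.rds.amazonaws.com --user=admin --password="$PASS" \
  --database=testdb --threads=4 --directory=/backups/testdb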

In this blog, we will look at a simple workaround and best practices for migrating DB objects such as procedures, triggers, etc. from an existing on-premises database server to Amazon RDS (MySQL), a fully managed relational database service provided by AWS.

In order to provide a managed service, RDS restricts certain privileges at the user level. Below is the list of restricted privileges in RDS.

  • SUPER – Enable use of other administrative operations such as CHANGE MASTER TO, KILL (any connection), PURGE BINARY LOGS, SET GLOBAL, and mysqladmin debug command. Level: Global.
  • SHUTDOWN – Enable use of mysqladmin shutdown. Level: Global.
  • FILE – Enable the user to cause the server to read or write files. Level: Global.
  • CREATE TABLESPACE – Enable tablespaces and log file groups to be created, altered, or dropped. Level: Global.

All stored programs (procedures, functions, triggers, and events) and views can have a DEFINER attribute that names a MySQL account, as shown below.

DELIMITER ;;
CREATE DEFINER=`xxxxx`@`localhost` PROCEDURE `prc_hcsct_try`(IN `contactId` INT, IN `section` VARCHAR(255))
BEGIN
IF NOT EXISTS (SELECT 1 FROM contacts_details WHERE contact_id = contactId) THEN
INSERT INTO contacts_details (contact_id, last_touch_source, last_touch_time) VALUES (contactId, section, NOW());
ELSE
UPDATE contacts_details SET last_touch_source = section, last_touch_time = NOW() WHERE contact_id = contactId;
END IF;
END ;;
DELIMITER ;

While restoring the same dump onto the RDS server, the restoration fails with the error below, because RDS does not grant its users the SUPER privilege, which is required to create an object whose DEFINER is another account:

ERROR 1227 (42000) at line 15316: Access denied; you need (at least one of) the SUPER privilege(s) for this operation

This is especially annoying because the restore fails near the end.

To overcome this, pipe the mysqldump output through the simple one-liner below, which strips the “DEFINER=`xxxxx`@`localhost`” clause. When you restore the dump file, the definer then defaults to the user performing the restore.

# Dump schema objects only (--no-data) and strip the DEFINER clause in flight.
mysqldump -u user -p -h 'testdb.xcvadshkgfd..us-east-1.rds.amazonaws.com' \
  --single-transaction --quick --triggers --routines --no-data --events testdb \
  | perl -pe 's/\sDEFINER=`[^`]+`@`[^`]+`//' > test_dump.sql

Below is the content of the dump file with the default “DEFINER” stripped; the same can also be done via awk or sed (a sed sketch follows the example).

DELIMITER ;;
CREATE PROCEDURE `prc_hcsct_try`(IN `contactId` INT, IN `section` VARCHAR(255))
BEGIN
IF NOT EXISTS (SELECT 1 FROM contacts_details WHERE contact_id = contactId) THEN
INSERT INTO contacts_details (contact_id, last_touch_source, last_touch_time) VALUES (contactId, section, NOW());
ELSE
UPDATE contacts_details SET last_touch_source = section, last_touch_time = NOW() WHERE contact_id = contactId;
END IF;
END ;;
DELIMITER ;

As you can see from the above, the DEFINER clause is completely removed.
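For completeness, here is what an equivalent sed filter might look like; treat it as a sketch and verify it against a sample of your own dump first, since DEFINER formatting can vary slightly between mysqldump versions.

# Same pipeline as before, with sed doing the DEFINER stripping.
mysqldump -u user -p -h 'testdb.xcvadshkgfd..us-east-1.rds.amazonaws.com' \
  --single-transaction --quick --triggers --routines --no-data --events testdb \
  | sed -e 's/ DEFINER=`[^`]*`@`[^`]*`//g' > test_dump.sql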

Best practices for RDS migration:

1. Restore dump files from an EC2 instance in the same VPC as the RDS instance, to minimize network latency.
2. Increase max_allowed_packet to 1GB (the maximum) to accommodate bigger packets (see the parameter-group sketch after this list).
3. Dump data in parallel, based on the instance capacity.
4. Bigger redo log files can enhance write performance.
5. Set innodb_flush_log_at_trx_commit=2 for faster writes, at a small cost in durability.
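Since RDS does not expose my.cnf directly, settings like those in items 2 and 5 are applied through a DB parameter group attached to the instance. Below is a hedged sketch using the AWS CLI; the group name mysql-migration-params is a placeholder.

# Both parameters are dynamic, so ApplyMethod=immediate takes effect
# without a reboot; 1073741824 bytes = 1GB.
aws rds modify-db-parameter-group \
  --db-parameter-group-name mysql-migration-params \
  --parameters "ParameterName=max_allowed_packet,ParameterValue=1073741824,ApplyMethod=immediate" \
               "ParameterName=innodb_flush_log_at_trx_commit,ParameterValue=2,ApplyMethod=immediate"

Remember to set innodb_flush_log_at_trx_commit back to 1 after the migration if full durability matters for your workload.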

 

via Planet MySQL

We Built This Massive Lego Voltron So You Don’t Have To

Here at io9, we love any excuse to foster our inner children. For Gizmodo video producer Tom Caswell, that opportunity came with Lego’s new Classic Voltron Set. It’s 2,300 pieces of 1980s Lego perfection that we put together over the course of eight glorious hours. Of course, we know not everyone has the time for that—so check out our timelapse video construction of the almighty Voltron, which just went on sale today.

Lego is calling this Voltron set “the biggest buildable Lego mech ever.” It consists of 2,321 pieces that are used to create buildable and posable lions which you can play with individually or combine to form Voltron—along with the giant sword and shield, of course. The finished Voltron is over 15 inches high and its look is inspired by the original 1980s animated Voltron TV show, though I’m sure you could also act out missions from DreamWorks’ Voltron: Legendary Defender series, currently on Netflix.

We’ll have a full review of the Lego Voltron soon, but in the meantime, you can see how this stackable sausage gets made with our time-lapse build. And if you did get your hands on a Lego Voltron of your own, comment with a photo of your build!

via Gizmodo

Judicial Overreach: The Internet Strikes Back

U.S. District Judge Robert Lasnik, in Washington state, issued a temporary injunction against Texas-based DefDist, barring them from releasing hobbyist computer code files. Somehow, the ruling by one district judge is allegedly binding on the entire country.

The Internet, thanks to Code Is Free Speech, struck back.

Firearm-Related Speech, Machining Instructions, Codes Published by Civil Rights Organizations, Activists at New CodeIsFreeSpeech.com Website
SACRAMENTO, CA (July 31, 2018) — Tonight, the organizations and individuals behind CodeIsFreeSpeech.com, a new Web site for the publication and sharing of firearm-related speech, including machine code, have issued the following statement:

Our Constitution’s First Amendment secures the right of all people to engage in truthful speech, including by sharing information contained in books, paintings, and files. Indeed, freedom of speech is a bedrock principle of our United States and a cornerstone of our democratic Republic. Through CodeIsFreeSpeech.com, we intend to encourage people to consider new and different aspects of our nation’s marketplace of ideas – even if some government officials disagree with our views or dislike our content – because information is code, code is free speech, and free speech is freedom.

Should any tyrants wish to chill or infringe the rights of the People, we would welcome the opportunity to defend freedom whenever, wherever, and however necessary. Hand-waving and hyperbole are not compelling government interests and censorship is not proper tailoring under the law.

[READ MORE]

The plaintively-whining pisswit plaintiffs allege no standing. They can present no case of a crime committed with a 3D-printed hobbyist experiment. They don’t explain why lawfully printing a gun is worse than lawfully assembling a zipgun from Lowe’s-supplied pipe. They do — falsely — claim that such a home-built firearm is “undetectable;” the law has been clear on that for decades: firearms must incorporate a minimum mass of metal to render them detectable by X-ray and metal detectors. It doesn’t matter if the firearm is machined by a big corporation, screwed together from pipes by a gangbanger, or printed by a law-abiding home hobbyist.

People have been hand-making firearms for nearly a millennium (commercial mass-production of firearms is a relatively new phenomenon), from materials far more appropriate to the pressures and stresses of a firearm than plastic.

Automated additive and subtractive manufacturing has been around for decades.

Plastic has been a structural element of firearms for decades.

Only now has it become a “problem.” Because now individuals have access to the technology, not just the licensed, regulated, tracked, inspected, harassed commercial builders.

Very few — apparently none — street thugs are going to spend hundreds of dollars on a 3D printer, more on filament, download CAD files, download more software to convert .SLDPRT files to .STL, run the conversion, and spend hours or days printing a large, bulky, poorly concealable .380 with which to rob his drug dealer. Stealing a gun or buying a gun on the black market is faster, cheaper, and gets them more effective tools of crime. That’s not what has the authoritarian goons worried.

They are afraid of the law-abiding people, who are getting a little tired of laws with no discernible relation to the constitution; honest folks who want protection the cops can’t or won’t provide; good people who might bypass the State’s attempt to render them helpless crime targets (as the criminals already do).

The goons fear arms in the hands of citizens who are tired of their shit. They are so afraid of the people that they are trying to preemptively shut down a new technology before it’s even ready to produce effective arms.

That’s OK. I still have pipe, nails, and wood in the garage.

And the country still has the Firearms Policy Coalition, Firearms Policy Foundation, The Calguns Foundation, California Association of Federal Firearms Licensees, Cody Wilson, and hundreds or thousands of people generating and sharing printer files.



Take the 3D AR Challenge!
3D-print a fully-functional, plastic AR-15, and successfully demonstrate it. The first person to do so will win 10 rounds of equally functional, 100% plastic 3D-printed .223 Remington ammunition.



Carl is an unpaid TZP volunteer. If you found this post useful, please consider dropping something in his tip jar. He could use the money, what with truck repairs and recurring bills. Click here to donate via PayPal.

via The Zelman Partisans

Nintendo got it right again

I worked at Circuit City when the PlayStation 2 launched. For weeks, we were sold out, and there was always a crowd around the blue demo unit in the gaming department. Looking back, it’s easy to see why the PlayStation 2 was a hit. It was powerful, inventive and excelled at local gaming. It was the right system for the time.

If Nintendo’s recent success proves anything, it’s that building for the time is more important than building for the future.

Nintendo is coming off a massive quarter that saw operating profit rise 88% year over year on the back of the Nintendo Switch. The company has sold nearly 20 million Switch systems since its launch, surpassing the total number of Wii U systems sold and closing in on the GameCube’s tally of 21.7 million units.

The Switch is great. I can’t get over how good it is. Again, like other systems before it, the Switch is the right system for the time. It’s portable, it’s small, and it leans heavily on cloud services. It’s not the most powerful system on the market nor does it pack 4k gaming or VR capabilities. The Switch doesn’t even have YouTube or Netflix. It’s a game system.

The Switch was a big bet for Nintendo. The company was coming off the Wii U, which, besides Mario Kart 8 and Splatoon, was a game system without good games. It seemed Nintendo had lost its edge. The Wii U, in a way, was a trial run for the Switch. It brought gaming off the TV and into the hands of gamers — but those gamers had to be in the same room as the Wii U base station. The Wii U didn’t go far enough, in every sense of the phrase.

By the time the Switch came out, the looming threat of mobile games seemed to be over. A few years earlier, it appeared that the smartphone was going to take over and eat up the casual gaming market. Even Sony got in on the theme, releasing a hybrid smartphone and game system called the Xperia Play. While the smartphone game market is alive and thriving, it never gobbled up the home console market. The Xbox One and PlayStation 4 launched and gamers settled into the couch. The Switch offers something different and timely.

To state the obvious, the Switch is mobile, and that’s what’s needed in today’s environment. It’s different from the Xbox One and PlayStation 4 and in the best way possible. Like previous Nintendo products, the graphics are below the market average, and the capabilities are less than competitors. But that doesn’t matter. The Switch’s gaming experience, to some, is superior. I take my Switch on long flights. I can’t do that with a PlayStation 4.

Gamers agree. With nearly 20 million units sold since it launched in 2017, the Switch is nearing the sales total of the Xbox One, which launched in 2013 and has sold between 25 and 30 million units. The PlayStation 4 is the clear winner of this generation of game systems, though, with nearly 80 million units sold — and an argument could be made that Sony built the PlayStation 4 for today’s gamers too, bypassing all the extras Microsoft included in the Xbox One and instead focusing solely on games.

Nintendo has done this in the past, too. Think back to the Wii. It launched in 2006 and went on to sell over 100 million units. In 2006 Sony and Microsoft were pushing heavily into HD gaming with the PlayStation 3 and Xbox 360. And for a good reason, too. Consumers were heavily shopping for their first HDTV at the time, and Sony and Microsoft wanted to build a system for the future. Both the PS3 and Xbox 360 went on to long, healthy lives but they never saw the runaway success of the Wii.

The Wii was the must-have Christmas gift for 2006 and 2007. It was novel more than beautiful. Compared to the graphics of the PS3, the Wii looked childish. But that was part of the appeal. First generation gamers were aging and having families, and the Wii was built for all ages. Anyone could pick up a Wiimote and swing it around to hit the tennis ball. To many outside the core gaming crowd, the Wii was magical. It was the right system at the right time.

The next part seems to be the hardest for Nintendo. Now that the Switch is a success, Nintendo needs to maintain it by building and supporting a robust ecosystem of games. And Nintendo cannot be the source of all the best games. Nintendo must court developers and publishers and keep them engaged in the advantages of the Switch gaming system. If it can do that, the Switch has a chance to be a generational product like the Wii before it.


via TechCrunch

How to recover deleted files




You need a document, photo or other file that you’re sure was deleted. You’ve searched your hard drive. You’ve scoured the Recycle Bin. No sign of it? Don’t panic. As long as you act quickly, you can usually bring that file back to life. And to accomplish that feat, you’ll want to turn to a recovery program to help you undelete it.

I’ve used and recommend three such applications: Recuva, EaseUS Data Recovery and Active Uneraser. With these programs, you can run a quick search for recently deleted files and conduct a more time-consuming but thorough scan to dig up older ones. You can scan external media, such as USB drives and SD cards, as well as your computer’s internal disk.

If the deleted file is one you’ve synced or stored in the cloud, you can typically undelete it as long as your cloud provider offers some type of recycle bin or trash folder. Popular services such as OneDrive, iCloud, Google Drive, Box and Dropbox all give you ways to resuscitate deleted files, but even here you need to act quickly. These services typically grant you up to 30 days to recover a file. After the clock has run out, those deleted files are purged and removed from their file servers.

If you want to revive a deleted file, an old adage applies: the sooner the better. When you delete a file in Windows, that file first bounces to the Recycle Bin. You can bypass the bin by turning it off through its Properties window or holding down the Shift key when you delete a file. Even if you use the Recycle Bin, at some point it will get too full and start kicking out older files. In other cases, you may decide to empty your bin to free up disk space. And that’s when the adventure begins.

When you permanently delete a file in Windows, it’s not physically removed from the disk. Rather, the file’s locations are marked as available by the file-allocation table. As such, the file still lives — unless and until you start storing new files that end up overwriting the deleted one. A file is stored in separate clusters of space on your hard drive. Some of a file’s clusters may become overwritten with new data while other clusters remain intact. In those cases, you may be able to recover parts of a file but not necessarily the whole thing.

Of course, going forward, you should always back up important documents and other files on a regular basis. In that case, you can retain deleted files on your backup source for as long as you want. But as far as repairing the damage that’s been done, these three apps do a good job recovering a deleted file from your PC.

Recuva

Recuva handles all types of deleted files, from documents to photos to videos to emails, and it can grab them from your hard drive, a removable drive or a USB stick. The program kicks off with a wizard that asks for the type and location of the file you want to restore. You can narrow the search this way or opt to look for all files in all locations. Recuva scans your drive to display a list of deleted files. You’ll see each file’s name, location, size, its chances for recovery and a comment with more details. After you select the file you want to restore, Recuva asks where to put it. Tip: If you hope to restore additional files from the same drive, save the recovered file in a different location to avoid overwriting any more clusters.

To cut to the chase, switch to advanced mode instead of using the wizard. There, you can select a location, pick a file type and enter a specific name or wildcard combination to limit the search. If your file doesn’t pop up, try a deep scan that digs for deleted files by analyzing each sector on the disk. But be prepared to wait: The deep scan took more than two hours to complete on my 2TB hard drive with 240GB of data.

I used Recuva to bring back deleted files from a hard drive, USB stick and SD card. I was able to successfully restore all files that were rated as “excellent” for recovery state. Files that were categorized as “poor” or “very poor” were either not recoverable at all or only partially recoverable, while those ranked as “unrecoverable” sadly never stood a chance. So we’re clear, a rating of excellent describes a freshly deleted file with no clusters overwritten. Poor or very poor refers to a file with few of its clusters intact. And unrecoverable points to an older deleted file with all of its clusters overwritten.

The basic version of Recuva is free; a $19.95 Pro edition works with virtual hard drives, provides automatic updates and delivers free premium support. There’s also a portable version you can run off a USB drive to avoid installing the software on your hard drive.

All told, Recuva works smoothly and efficiently. The wizard is simple to use, but be warned that it dumps so many deleted files into your lap that you might have a hard time locating the one you want. Instead, consider jumping straight to advanced mode, where you can exercise more control over what you see.

EaseUS Data Recovery

EaseUS Data Recovery, available for Windows and macOS, offers a variety of features and is available as both a free and paid product. You can restore files from internal and external hard drives, USB sticks, RAID configurations, SD cards, MP3 players, cameras, camcorders and more. EaseUS Data Recovery Wizard Free starts off by showing all of your hard drive partitions. Select a partition to scan for deleted files or choose a specific folder. Run a scan to begin the search; the program then displays a list of locations on your drive where it uncovered deleted files, arranged by folder or file type. Select a specific folder to see the files inside. You can narrow the list by opting to view only specific file types, such as graphics, audio, video, documents and emails. You can also search for files by name and wildcard symbols, such as an asterisk.

By default, the software shows you key details for each file, including the name, size, date, type and path. The program doesn’t indicate the recovery state of deleted files, but you can preview a deleted file to see if it’s intact.

While you’re hunting for your deleted file, EaseUS conducts a deep scan to seek out files that may not have been uncovered in the first scan. That process isn’t exactly speedy: On my drive, the deep scan took more than five hours to finish. The good news is that you can view the initial results of the deep scan while it’s running. After the scan, check the file or files you wish to restore, and the software will ask for a recovery location. Remember to choose a drive other than the source if you want to undelete additional files from the same spot. After the program has revived your chosen files, it opens the recovery folder so you can check out the results.

With EaseUS Data Recovery, I was able to restore all recently deleted files and mostly recover older deleted files as well as those on hidden or lost partitions. The Deep Scan was especially effective at restoring files that I thought I’d lost forever.

The free version of EaseUS Data Recovery poses one major obstacle: You can recover only up to 500MB of files at a time. By sharing a link to the application on Facebook, Twitter or Google+, though, you can increase that limit to 2GB. If you need to restore a larger file, however, you’ll have to pony up for one of the paid editions. Priced at $69.95, Data Recovery Wizard Pro can undelete any size file. For $99.90, Data Recovery Wizard Pro+WinPE offers a bootable media option in case your hard drive ever goes belly up. And if you recover files and hard drives for a living, paying $299 per year (or $499 for a lifetime subscription) scores you the Data Recovery Wizard Technician version. All three paid editions grant you free lifetime upgrades and free technical support.

Active Uneraser

Active Uneraser has several tricks up its sleeve. You can start with the free version, which is plenty powerful in its own right. You can recover files from your hard drive, external drives, USB sticks and SD cards. You can undelete damaged partitions. The software also supports RAID configurations. Active Uneraser kicks off by displaying your hard drive partitions, even ones that have been deleted. Select a specific partition and the program provides plenty of details, such as the total capacity, used space, free space, file system and condition.

After you scan a partition, Active Uneraser displays all the files contained within. You can switch the view among all files, existing files and deleted files. The files are arranged by folder to allow for quick and easy searching. You can always search for a deleted file by name and/or wildcards. If the initial QuickScan comes up empty, try the QuickScan Plus feature to detect more lost or damaged files or folders. Next in line, a SuperScan digs deeper but takes longer to find deleted files. If those methods don’t do the trick, turn to the Last Chance option, which tries to uncover files based on their signatures, the byte patterns that identify their format.

You can preview certain types of deleted files, but the software limits your view to files 10MB or smaller. To bring back a file, select it and run the Unerase command. Active Uneraser asks for a location to restore the file and then opens File Explorer or Windows Explorer to display the recovery folder.

I was able to restore all recently deleted files from a hard drive, USB stick and SD card. SuperScan took four hours to run while Last Chance ran for six hours; both were able to find and revive older files as well.

The free version comes with one small restriction: You can recover just one file at a time. To get past this limitation and access other features, upgrade to one of the two paid versions. For $39.99, the Professional edition adds a bootable Windows Recovery environment in case your PC can’t boot up. For $49.99, the Ultimate edition kicks in a Linux recovery CD and the ability to repair or restore damaged RAID configurations.

The best recovery program

Recuva, EaseUS Data Recovery and Active Uneraser all work smoothly and effectively to recover your deleted files. If you’re seeking a free tool, try Recuva. It works well and isn’t saddled with the limitations imposed by the free flavors of the other two programs. If you don’t mind spending a few bucks, check out the Professional edition of Active Uneraser, as it’s reasonably priced, offers three different levels of scans, and kicks in the bootable recovery environment.


via Engadget