Encrypt and Decrypt Eloquent Model Fields in Laravel Apps

https://laravelnews.imgix.net/images/laravel-ciphersweet.png?ixlib=php-3.3.1

Laravel CipherSweet is a package by Spatie that integrates searchable field-level encryption into Laravel applications. The package’s readme explains the problem CipherSweet can help solve as follows:

In your project, you might store sensitive personal data in your database. Should an unauthorised person get access to your DB, all sensitive data can be read, which is obviously not good.

To solve this problem, you can encrypt the personal data. This way, unauthorized persons cannot read it, but your application can still decrypt it when you need to display or work with the data.

This package is a wrapper around CipherSweet that integrates its features into Laravel models easily. Here’s an example from the readme’s setup instructions that illustrates what a model looks like when it uses CipherSweet:

use Spatie\LaravelCipherSweet\Contracts\CipherSweetEncrypted;
use Spatie\LaravelCipherSweet\Concerns\UsesCipherSweet;
use ParagonIE\CipherSweet\BlindIndex;
use ParagonIE\CipherSweet\EncryptedRow;
use Illuminate\Database\Eloquent\Model;

class User extends Model implements CipherSweetEncrypted
{
    use UsesCipherSweet;

    public static function configureCipherSweet(EncryptedRow $encryptedRow): void
    {
        $encryptedRow
            ->addField('email')
            ->addBlindIndex('email', new BlindIndex('email_index'));
    }
}

This allows you to encrypt a user’s email to keep it safe from unauthorized people reading the data, while still giving your application the ability to decrypt it when you need to display or work with it.
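In everyday use, this is meant to be transparent: configured attributes are written to the database encrypted, and your application reads them back decrypted. A minimal sketch of what that looks like (this is an illustration, not the readme’s own example, and it assumes email is fillable on the User model above):

// The value is encrypted before it is stored in the database...
$user = User::create(['email' => 'rias@spatie.be']);

// ...and decrypted again when your application reads the attribute.
echo $user->fresh()->email; // rias@spatie.be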

Once you have configured this package and set up a model, you can search encrypted data in the database using blind indexes:

$user = User::whereBlind('email', 'email_index', 'rias@spatie.be')->first();

The package also helps with generating encryption keys and encrypting existing model attributes, which speeds up integrating CipherSweet.

I want to point out that you should not use this package blindly without understanding the ins and outs of the use case you are trying to solve. You can learn more about CipherSweet on this page, which has many linked resources.

CipherSweet also has PHP-specific documentation to help get you up to speed with the underlying PHP package.

I would also recommend reading Rias’ post, Encrypting Laravel Eloquent models with CipherSweet.

To get started with this package, check it out on GitHub at spatie/laravel-ciphersweet.

Laravel News

The FBI Said This is the Best Handgun (And Why They’re Wrong)

In response to numerous inquiries from local law enforcement departments, the FBI undertook a comprehensive evaluation of the sidearms available in 1987 (most of which are still made today) to determine which pistol was the best. Thirteen of the most talented instructors that the Federal Bureau of Investigation had to offer all met at a […]

Read More …

The post The FBI Said This is the Best Handgun (And Why They’re Wrong) appeared first on The Firearm Blog.

The Firearm Blog

Darktable celebrates its 10-year anniversary with substantial version 4.0.0 update

https://3.img-dpreview.com/files/p/E~C0x0S1404x1053T1200x900~articles/6340644378/screenshot_lighttable.jpeg

Just about a year after releasing darktable 3.6, the darktable team has announced darktable 4.0.0. The update celebrates 10 years of darktable offering photographers open source raw image editing. The major release adds many new features to the open source photography workflow app and raw image editor, including color and exposure mapping, filmic v6, guided Laplacian highlight reconstruction, a new perceptually uniform color space, revamped user interface, performance improvements, and much more.

Color and exposure mapping comprise a new feature in the ‘exposure’ and ‘color calibration’ modules that allows you to define and save a specified target color/exposure for the color pickers. You can match any source object against an arbitrary target color. You can use this tool to perform white balance adjustments against non-gray objects of known color, or ensure consistent color across a batch of images.

Filmic v6 includes a new color science. Darktable writes, ‘This change removes the mandatory desaturation close to medium white and black and replaces it with a true gamut mapping against the output (or export) color space. This allows for more saturated colors, notably in blue skies.’ Darktable 4.0.0 now includes a ‘fully-sanitized color pipeline’ from input (color calibration), creative changes (color balance RGB) and through to output (filmic v6).

Within the ‘highlight reconstruction’ module is a new ‘guided Laplacian’ method. This uses a ‘multi-scale wavelet scheme to extract valid details from non-clipped RGB channel(s)’ and ‘propagates the color gradients from neighboring valid regions using edge-aware color diffusion.’ The team writes that this feature promises to limit color bleeding through edges, such as green leaves bleeding color into a reconstructed blue sky. This method is only available for images captured with a Bayer sensor, so Fujifilm X-Trans users are out of luck here.

Darktable 4 introduces the darktable Uniform Color Space 2022 (darktable UCS 22). It’s a perceptually uniform color space built using psychoperceptual experimental data that was gathered for artistic saturation changes. What does this actually mean? Darktable UCS 22 ‘uses a brightness-saturation scheme that compensates for the Helmholtz-Kohlrausch effect (accounting for the contribution of colorfulness in perceived brightness) and allows an efficient gamut-mapping against pipeline RGB at a constant brightness. It will make the saturation control in color balance RGB better behaved.’ You can learn a bit more about the Helmholtz-Kohlrausch effect in the latter half of the video below.

The user interface has been completely revamped to improve the overall look and consistency. Padding, margins, color, contrast, alignment and icons have been reworked throughout the application. Collapsible sections within modules have been redesigned to improve functionality, plus channel mixer RGB, exposure and color calibration modules include new collapsible sections. The vignetting module has been split into two sections. Superfluous sections have been removed in ‘crop’ and ‘white balance’ tools. The default theme is now Elegant Gray, which is the recommended choice of the darktable team.

The app’s performance and OpenCL settings have been optimized, so performance is more tunable by the user and should be improved overall. There are many other changes, including a color glossary, new contrast parameters, a new ‘collection filters’ module, improved search, improved export options, improved shortcuts when using sliders, a new raw exposure function, and more. The latest update also includes many bug fixes. For the full details, visit the darktable website.

Darktable 4.0.0 is available now for Linux, macOS and Windows. At that link, you can also download the software’s source code. If you would like to give darktable 4.0.0 a try but don’t know where to start, there’s a very detailed user manual available here.

Articles: Digital Photography Review (dpreview.com)

This is the most based pro-gun campaign ad I’ve ever seen

https://media.notthebee.com/articles/62c5f66420cd962c5f66420cda.jpg

Former ASU football standout and inspirational speaker Jerone Davison is running for Congress on the Republican ticket in Arizona. So far, it looks like Mr. Davison will be pulling ZERO punches in this fight.

Not the Bee

4 Errors You Should Avoid While Handling Money With PHP

https://dev.to/social_previews/article/1119335.png



1. Not knowing which datatype to use in MySQL

I once heard it’s better to use integers when handling financial data: you convert a price like €10 to its lowest unit (cents in this case), so you work with 1000 as the amount. That avoids the floating-point problem, which is easy to demonstrate by typing the following into your Google Chrome console:

> 0.1 + 0.2
< 0.30000000000000004

If you want to learn more about this problem, visit this website. Working with integers is terrible for readability, though (quick: how much is 13310 in euros?). Another disadvantage is the range: a standard signed INT column in MySQL tops out at 2147483647, which is roughly € 21,474,836.47 when stored as cents. With euros you probably won’t hit that limit quickly, but with the Vietnamese dong you would. Learnings: use decimals (not floats!) in MySQL to store monetary values. Depending on how many decimals you need, decimal(15,2) is often enough.
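As a rough illustration of both points (the migration line and column name are only an example, and the arithmetic assumes the bcmath extension), you can store the value as a decimal and keep amounts as strings instead of floats:

// Migration: store money as DECIMAL, not FLOAT.
// $table->decimal('total', 15, 2);

// In PHP, keep amounts as strings and use bcmath for the arithmetic:
$price = '10.00';
$vat = bcmul($price, '0.21', 2); // "2.10"
$total = bcadd($price, $vat, 2); // "12.10"

// The floating-point problem again, for comparison:
var_dump(0.1 + 0.2 === 0.3); // bool(false)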



2. Not having something to fact-check the numbers

Imagine we have a shopping cart with 1 product for € 100, VAT of € 21, and a total of € 131. The first time, you’re sharp and immediately spot the mistake (the total should be € 121). After the 100th time, you start to go blind to those mistakes.

That’s why you need something to fact-check whether the numbers are correct. I’ve created a Google Sheet for me and my team where we can all verify them. This is especially crucial if you work with people who test your product but don’t have access to the code: how else should they know whether the displayed price is correct?
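The same idea also works in code: an automated check that recomputes the totals independently of your application logic. A minimal sketch (the amounts and the 21% VAT rate come from the example above; PHPUnit and bcmath are assumed):

use PHPUnit\Framework\TestCase;

class CartTotalTest extends TestCase
{
    public function test_total_matches_its_components(): void
    {
        $basePrice = '100.00';
        $vat = bcmul($basePrice, '0.21', 2);  // "21.00"
        $total = bcadd($basePrice, $vat, 2);  // "121.00"

        // Recompute independently so a wrong total can't slip through unnoticed.
        $this->assertSame('121.00', $total);
    }
}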



3. Not splitting the price into all the components

Every part of a price should be stored separately. If not, there’s no way to reproduce the components later on when you need them. So save the VAT amount, the discount amount, the base price, and the total all separately. Chances are your app will gain more price components in the future.
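Here is what that could look like in a Laravel migration for an order-items table (the table and column names are only an illustration, not the article’s actual schema):

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

Schema::create('order_items', function (Blueprint $table) {
    $table->id();
    $table->foreignId('order_id');
    $table->decimal('base_price', 15, 2);      // price before VAT and discounts
    $table->decimal('discount_amount', 15, 2); // discount applied to this line
    $table->decimal('vat_amount', 15, 2);      // VAT charged on this line
    $table->decimal('total', 15, 2);           // what the customer actually paid
    $table->timestamps();
});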



4. Using foreign keys in the ‘orders’ table

One of my dumbest mistakes. I had an ‘orders’ table where all the orders of an e-commerce store were stored. Unfortunately, it only held a reference to the actual products, and I read the product price through that reference. Everything was fine until one of the product prices changed and older orders were retroactively affected by it 😅
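The usual fix is to snapshot the price (and anything else that can change, like the product name) onto the order row at the moment of purchase, so the product reference is kept for traceability but never used to reconstruct historical amounts. A rough sketch, assuming hypothetical Eloquent relations and columns:

// Copy the price onto the order line at checkout time, so later changes
// to the product never affect historical orders.
$order->items()->create([
    'product_id'   => $product->id,    // keep the reference for traceability
    'product_name' => $product->name,  // snapshot what was sold...
    'unit_price'   => $product->price, // ...and what it cost at that moment
    'vat_amount'   => $vat,
    'total'        => $total,
]);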

I’ve made many mistakes even though I have been developing applications for years. But without resistance, there’s no growth, so I tend to share my mistakes so you might prevent them.

I’m planning on writing an ebook on developing applications where you work with money. If you’re interested you might wanna subscribe to get free access to the first chapter.

Subscribe here

Laravel News Links

How to Explore Datasets in Go

https://static1.makeuseofimages.com/wordpress/wp-content/uploads/2022/06/photo-of-data-charts.jpg

To analyze a dataset, you first need to understand the data. Sometimes you might have no prior knowledge of a dataset, which prevents you from getting the most out of it. As a data analyst, you can use exploratory data analysis (EDA) to get to know your dataset before in-depth analysis.

Exploratory data analysis (EDA) investigates a dataset to gain meaningful insights. The process of performing EDA involves querying information about the structure and contents of a dataset.

Installing the Gota Package

The Gota package is the most popular data-analysis package for Go; it’s like Python’s Pandas package, but for Go. The Gota package contains many methods for analyzing datasets and can read JSON, CSV, and HTML formats.

Run this command on your terminal in the directory where you have initialized a Go module file:

go get -u github.com/go-gota/gota

The command adds Gota to your module’s dependencies, ready for you to import and use.

Just like Pandas, Gota supports series and dataframe operations. There are two sub-packages in the Gota package: series and dataframe. You can import either one or both, depending on your needs.

import (
"github.com/go-gota/gota/series"
"github.com/go-gota/gota/dataframe"
)

Reading a Dataset Using the Gota Package

You can use any CSV file you like, but the following examples show results from a Kaggle dataset, containing laptop price data.

Gota lets you read CSV, JSON, and HTML file formats to create dataframes using the ReadCSV, ReadJSON, and ReadHTML methods. Here’s how you load a CSV file into a dataframe object:

file, err := os.Open("/path/to/csv-file.csv")
if err != nil {
    log.Fatal(err) // stop here: reading a nil file below would crash
}
defer file.Close()

dataFrame := dataframe.ReadCSV(file)
fmt.Println(dataFrame)

You can use the Open method of the os package to open a CSV file. The ReadCSV method reads the file object and returns a dataframe object.

When you print this object, the output is in a tabular format. You can further manipulate the dataframe object using the various methods Gota provides.

The printed output only includes some of the columns if the dataset has more columns than a set limit.

Fetching the Dimension of the Dataset

The dimensions of a dataframe are the number of rows and columns it contains. You can fetch these dimensions using the Dims method of the dataframe object.

var rows, columns = dataFrame.Dims()

Replace one of the variables with an underscore to fetch the other dimension only. You can also query the number of rows and columns individually, using the Nrow and Ncol methods.

var rows = dataFrame.Nrow()
var columns = dataFrame.Ncol()

Fetching the Data Types of Columns

You’ll need to know the data types of a dataset’s columns to analyze it. You can fetch these using the Types method of your dataframe object:

var types = dataFrame.Types()
fmt.Println(types)

The Types method returns a slice containing the columns’ data types.

Fetching the Column Names

You’ll need the column names to select specific columns for operations. You can use the Names method to fetch them.

columnNames := dataFrame.Names()
fmt.Println(columnNames)

The Names method returns a slice of the column names.

Checking for Missing Values

You might have a dataset that contains null or non-numeric values. You can check for such values using the HasNaN and IsNaN methods of a series object:

aCol := dataFrame.Col("display_size")
var hasNull = aCol.HasNaN()
var isNotNumber = aCol.IsNaN()

HasNaN checks whether a column contains null elements. IsNaN returns a slice of booleans indicating whether each value in the column is NaN (not a number).

Performing Descriptive Statistical Analysis

Descriptive statistical analysis helps you understand the distribution of numerical columns. Using the Describe method, you can generate a descriptive statistical analysis of your dataset:

description := dataFrame.Describe()
fmt.Println(description)

The Describe method returns metrics like the mean, standard deviation, and maximum values of columns in a dataset. It summarizes these in a tabular format.

You can also be specific and focus on columns and metrics by selecting a particular column, then querying for the metric you want. You should first fetch the series representing a specific column, then use its methods like so:

aCol := dataFrame.Col("display_size")
var mean = aCol.Mean()
var median = aCol.Median()
var minimum = aCol.Min()
var standardDeviation = aCol.StdDev()
var maximum = aCol.Max()
var quantiles25 = aCol.Quantile(0.25) // 25th percentile

These methods mirror the results from the descriptive statistical analysis that Describe performs.

Fetching the Elements in a Column

One of the final tasks you’ll want to perform is to check the values in a column for a general overview. You can use the Records method to view the values of a column.

aCol := dataFrame.Col("brand")
fmt.Println(aCol.Records())

This method returns a slice of strings containing the values in your selected column.

Exporting a Gota Dataframe to a File

If you choose to go further and use the Gota package for full data analysis, you’ll need to save data in files. You can use the WriteCSV and WriteJSON methods of dataframe to export files. The methods take in a file that you’ll create using the os package’s Create method.

Here’s how you can export a dataframe using the Gota package.

dataFrame := dataframe.ReadCSV(file)
outputFile, err := os.Create("output.csv")

if err != nil {
log.Fatal(err)
}

err = dataFrame.WriteCSV(outputFile)

if err != nil {
log.Fatalln("There was an error writing the dataframe contents to the file")
}

The dataFrame variable is a representation of the dataframe. When you use the Create method of the os package, it creates a new, empty file with the specified name and returns the file. The WriteCSV method takes in the file instance and returns an error or nil if there’s no error.

Exploratory Data Analysis Is Important

An understanding of data and datasets is essential for data analysts and machine learning specialists. It is a critical operation in their work cycle, and exploratory data analysis is one of the techniques they use to achieve that.

There’s more to the Gota package. You can use it for various data wrangling functions in the same way that you’d use the Python Pandas library for data analysis. However, Gota doesn’t support quite as much functionality as Pandas.

MUO – Feed

UK Man Vomits for Months, Ends Up in Hospital After Vitamin D Overdose

https://i.kinja-img.com/gawker-media/image/upload/c_fill,f_auto,fl_progressive,g_center,h_675,pg_1,q_80,w_1200/9ab455ae991773702287f5dcfccd81c3.jpg

Image: Shutterstock (Shutterstock)

Doctors in the UK say a man’s intense supplement regimen landed him in the hospital with vitamin D poisoning. In a new case report, they detail how their patient became sick soon after he started to take large doses of vitamins and minerals. Though vitamin D overdose is uncommon, the study authors say, cases seem to be on the rise globally.

According to the paper, published in BMJ Case Reports, the middle-aged man was referred to an emergency department by his general practitioner. For nearly three months, he had been dealing with a variety of ongoing symptoms, including vomiting, diarrhea, abdominal pain, dry mouth, tinnitus, and leg cramps; he had also lost almost 30 pounds. Tests soon ruled out other potential causes of his illness, such as infection. But they revealed evidence of acute kidney injury, as well as much higher levels of vitamin D and calcium (a common sign of vitamin D overdose) than normal in his system.

The man reported that his symptoms began about a month after he decided to take a lengthy list of supplements on the advice of a private nutritionist. But his allotted doses were far higher than the daily recommended amounts. He reportedly took 50,000 IU of vitamin D, for instance, or almost 100 times the 600 IU a day we should be getting. (Other supplements included vitamin K2, vitamin C, vitamin B9, omega-3s, zinc, and magnesium.) The man said he did stop taking the cocktail after his symptoms appeared, but they continued nonetheless.

Ultimately, he was placed on intravenous fluids and hospitalized for eight days, with doctors monitoring his blood every day to ensure that he was improving. He was also given counseling and drugs known as bisphosphonates to manage his high calcium levels during and after his hospitalization. Vitamin D is fat-soluble, meaning that it gets absorbed into the body’s fatty tissues and doesn’t quickly dissipate. Two months after his hospital stay, follow-up tests showed that his levels of calcium had returned to near-normal, but not his levels of vitamin D.

People naturally get vitamin D from food or from regular exposure to sunlight. And while there is some evidence that many people may have insufficient vitamin D levels, there’s no evidence that taking megadoses of vitamin D or other supplements will improve health. On the other hand, vitamin D intoxication, or hypervitaminosis D, is almost always linked to improper supplementation. Data is sparse on how often it happens, but a 2016 analysis of U.S. poison control data found over 25,000 reports related to vitamin D documented between 2000 and 2014. Most reports of illness were mild to moderate, with no related deaths, but exposures did seem to become more common over time—a trend noted by the current study authors.

“Globally, there is a growing trend of hypervitaminosis D, a clinical condition characterized by elevated serum vitamin D3 levels,” they wrote, adding that cases are more common in women, children, and surgical patients.

Supplements can be useful for certain groups, such as people with clear nutritional deficiencies or pregnant people who need extra folic acid. But many doctors remain skeptical of their use for the average person. Indeed, an influential panel of experts recently recommended against taking vitamin E or beta-carotene supplements to prevent cancer or heart disease, citing the lack of good evidence for their benefits as well as some evidence that they can actually be harmful. Vitamin D overdoses like this current case aren’t common, but it is yet another example of why supplements aren’t quite as useful or harmless as commonly believed.

“This case report further highlights the potential toxicity of supplements that are largely considered safe until taken in unsafe amounts or in unsafe combinations,” the authors wrote.

Gizmodo

Video: A rare, unique view of fireworks launching and igniting from a barge

https://3.img-dpreview.com/files/p/E~C111x0S1779x1334T1200x900~articles/5007787721/Fireworks.jpeg

Every year, the United States celebrates its birthday on July 4th. In practically any town, large or small, fireworks are launched in celebration of the country’s independence. While some displays are captivating, there are many more amateur efforts in play. Furthermore, most people tend to capture them with a smartphone.

Luckily, New Hampshire native Ron Risman has neighbors with a pyrotechnics expert in the family. He was given the go-ahead to place a GoPro HERO8 camera between the explosives on the barge so it could capture them launching and igniting from a unique angle. Titled ‘Finale,’ the three-and-a-half-minute clip provides the point of view you would have if you were sitting in the thick of it.

A GoPro HERO8 camera (bottom-left) was carefully mounted on a barge to capture the fireworks being launched into the sky.

‘I felt this could be an interesting perspective as long as the GoPro was able to expose properly for the fireworks. I set the GoPro HERO8 to record in 4K/60p resolution with a wide lens and 2 stops underexposed. The exposure setting worked out perfectly to capture the color and beauty of the explosions,’ Risman tells DPReview.

To stabilize the camera for smoother footage, the GoPro was placed into an X-PWR H8 cage that was mounted to a magic arm. That arm was then mounted to a Platypod Pro to give it additional surface area and stability. This was all nailed to temporary plywood on the dock so that it wouldn’t move when nearby explosives ignited.

The GoPro was connected to an external battery pack using a 3BR Powersports X-PWR H8 external power cable. The only shortcoming in the planning process came down to a matter of time: a 128GB microSD card can only record for two-and-a-half hours at 4K/60p.

The GoPro HERO8 survived the launch site on the barge (pictured).

Recording began at 7:20 pm, as the barge started floating out on the lake. The action concluded around 9:20 pm, meaning he was able to capture everything he wanted last night. ‘By the end of the show I had no idea whether the camera survived or whether or not the camera was still recording, but fortunately everything went as planned and the footage captured was as spectacular as I had hoped it would be,’ Risman adds.

Risman edited the footage using Adobe’s Premiere Pro on his MacBook Pro. He watched the footage at both regular speed and 4X slow motion before settling on an edit that highlights the latter for more of an impact. The soundtrack ‘Fractured Time’ by Cody Martin was licensed via Soundstripe. All in all, this is truly a fitting ‘finale’ for a long holiday weekend.

Articles: Digital Photography Review (dpreview.com)