Laravel Filament: How To Upload Video Files
Filament admin panel has a File Upload field, but is it possible to upload video files with it? In this tutorial, I will demonstrate that and show the uploaded video in a custom view page using a basic HTML <video> element.
Prepare Server for Large File Uploads
Before touching any code, we will first prepare the web server to accept larger file uploads, in the php.ini file settings.
The default value for upload_max_filesize is 2 MB, and 8 MB for post_max_size. We need to increase those values.
I’m using PHP 8.1. If yours is different, change the version to yours.
sudo nano /etc/php/8.1/fpm/php.ini
I will be uploading a 17 MB video file, so I will increase both upload_max_filesize and post_max_size to 20 MB.
post_max_size = 20M
upload_max_filesize = 20M
Next, restart the PHP FPM service.
sudo service php8.1-fpm restart
Now, our PHP is ready to accept such files.
Uploading File
On the DB level, we will have one model Video with two string fields: attachment and type.
app/Models/Video.php:
class Video extends Model
{
    protected $fillable = [
        'attachment',
        'type',
    ];
}
Next, Filament. When creating a Filament Resource, we also need to create a record view.
php artisan make:filament-resource Video --view
For the form, we will have a basic File Upload field. This field will be required, will have a max upload size of 20MB, and I will preserve the original filename. The last one is optional.
app/Filament/Resources/VideoResource.php:
class VideoResource extends Resource
{
    // ...

    public static function form(Form $form): Form
    {
        return $form
            ->schema([
                FileUpload::make('attachment')
                    ->required()
                    ->preserveFilenames()
                    ->maxSize(20000),
            ]);
    }

    // ...
}
Before uploading, we also need to set the max file size for Livewire. First, we need to publish the Livewire config.
php artisan livewire:publish --config
config/livewire.php:
return [
    // ...

    'temporary_file_upload' => [
        'disk' => null,        // Example: 'local', 's3' Default: 'default'
        'rules' => 'max:20000',
        'directory' => null,   // Example: 'tmp' Default: 'livewire-tmp'
        'middleware' => null,  // Example: 'throttle:5,1' Default: 'throttle:60,1'
        'preview_mimes' => [   // Supported file types for temporary pre-signed file URLs.
            'png', 'gif', 'bmp', 'svg', 'wav', 'mp4',
            'mov', 'avi', 'wmv', 'mp3', 'm4a',
            'jpg', 'jpeg', 'mpga', 'webp', 'wma',
        ],
        'max_upload_time' => 5, // Max duration (in minutes) before an upload gets invalidated.
    ],

    // ...
];
Now the upload should be working. But before creating the record, we need to get the MIME type of the file and save it into the DB.
app/Filament/Resources/VideoResource/Pages/CreateVideo.php:
<?php

namespace App\Filament\Resources\VideoResource\Pages;

use Illuminate\Support\Facades\Storage;
use App\Filament\Resources\VideoResource;
use Filament\Resources\Pages\CreateRecord;

class CreateVideo extends CreateRecord
{
    protected static string $resource = VideoResource::class;

    protected function mutateFormDataBeforeCreate(array $data): array
    {
        $data['type'] = Storage::disk('public')->mimeType($data['attachment']);

        return $data;
    }
}
Viewing Video
To view the video, we will use a basic HTML <video> tag. For this, we will need to make a basic custom view page in Filament.
First, let’s add the custom view path to the ViewRecord page.
app/Filament/Resources/VideoResource/Pages/ViewVideo.php:
class ViewVideo extends ViewRecord
{
    protected static string $resource = VideoResource::class;

    protected static string $view = 'filament.pages.view-video';
}
Now let’s create this view file and add a video player to it.
resources/views/filament/pages/view-video.blade.php:
<x-filament::page>
    {{-- assumes the file is stored on the "public" disk and `php artisan storage:link` has been run --}}
    <video controls>
        <source src="{{ asset('storage/' . $this->record->attachment) }}" type="{{ $this->record->type }}">
        Your browser does not support the video tag.
    </video>
</x-filament::page>
After visiting the view page, you will see your uploaded video in the native browser video player.
That’s it! As you can see, video files aren’t different from any other files; they just need different validation to allow larger sizes.
You can learn more tips on how to work with Filament in my 2-hour course Laravel Filament Admin: Practical Course.
Laravel News Links
New ‘Double Dragon’ game trailer promises nostalgic beat-em-up thrills
The original Double Dragon basically invented co-op beat-em-up action in 1987, and now modern players are about to get a dose of nostalgic side-scrolling goodness thanks to a new franchise installment. Double Dragon Gaiden: Rise of the Dragons launches this fall for every major platform, including PC, Xbox consoles, PlayStation 4 and 5 and the Nintendo Switch.
What to expect from this installment? The trailer suggests a return to the tried-and-true beat-em-up formula. There’s a nice retro pixelated art style, 13 playable characters to choose from and, of course, two-player local co-op. The new title also includes a tag-team ability, so you actually play as two characters at once.
Developer Modus Games is teasing some roguelite elements, like a dynamic mission select feature that randomizes stage length, enemy number and difficulty. This is also a 2023 console game and not an arcade machine from the 1980s, so expect purchasable upgrades and some light RPG mechanics.
As for the plot, the years haven’t been kind to series protagonists Jimmy and Billy Lee. The sequel finds New York City devastated by nuclear war, which leads to gangs of hooligans roaming the radioactive streets. You know what happens next (you beat them up). It remains to be seen if your avatars can beat up that long nuclear winter.
Modus Games isn’t a well-known developer, but it has plenty of well-regarded indie titles under its belt, like Afterimage and Teslagrad 2. The trailer looks cool, so this is worth keeping an eye on, especially given that there hasn’t been a Double Dragon game since the long-ago days of 2016.
This article originally appeared on Engadget at https://www.engadget.com/new-double-dragon-game-trailer-promises-nostalgic-beat-em-up-thrills-175831891.html?src=rss
Engadget
Chuck Norris in a tank crushing cars while firing uzis? What kind of church is this??
Last weekend the Satanists were shredding Bibles for the glory of Satan at SatanCon in Massachusetts.
Not the Bee
‘Star Trek’ Fans Can Now Virtually Tour Every Starship Enterprise Bridge
A new web portal allows "Star Trek" fans to explore the iconic bridge of the starship Enterprise through 360-degree, 3D models and learn about its evolution throughout the franchise’s history. Smithsonian Magazine reports: The site features 360-degree, 3D models of the various versions of the Enterprise, as well as a timeline of the ship’s evolution throughout the franchise’s history. Fans of the show can also read detailed information about each version of the ship’s design, its significance to the "Star Trek" storyline and its production backstory. Developed in honor of the "Star Trek: Picard" series finale, which dropped late last month on Paramount+, the portal is a collaboration between the Roddenberry Estate, the Roddenberry Archive and the technology company OTOY. A group of well-known "Star Trek" artists — including Denise and Michael Okuda, Daren Dochterman, Doug Drexler and Dave Blass — also supported the project.
The voice of the late actress Majel Roddenberry, who played the Enterprise’s computer for years, will be added to the site in the future. Gene Roddenberry died in 1991, followed by Majel Roddenberry in 2008; the two had been married since 1969. The portal’s creators also released a short video, narrated by actor John de Lancie, exploring every version of the Enterprise’s bridge to date, "from its inception in Pato Guzman’s 1964 sketches, through its portrayal across decades of TV shows and feature films, to its latest incarnation on the Enterprise-G, as revealed in the final episode of ‘Star Trek: Picard,’" per the video description. Accompanying video interviews with "Star Trek" cast and crew — including William Shatner, who played Captain Kirk in the original series, and Terry Matalas, a showrunner for "Star Trek: Picard" — also explore the series’ legacy.
Read more of this story at Slashdot.
Slashdot
Laravel analytics – how and why I made my own analytics package
I’ve used Google Analytics for a couple of years and it worked quite well for me. So why did I make my own analytics package? There were a couple of reasons to do so:
- Google Analytics has become quite complex and slow lately. Especially with the introduction of the new Google Analytics 4 it became more complex, and I realised that I don’t use even 0.1 percent of its capability. This blog and the other websites I developed as side projects only need simple things like visitor count per day in a specific period, and page views for the top visited pages. That’s it!
- I wanted to get rid of third-party cookies as much as possible
- Third-party analytics tools are mostly blocked by ad blockers, so I see lower numbers than the real visitor counts.
Requirements
- It needs to be a Laravel package, as I want to use it in a couple of projects
- Keep it simple, only the basic functionality
- Track page visits by URI, and also the relevant model ids if applicable (for example blog post id, or product id)
- Save UserAgents for possible further analysis of visitor devices (desktop vs mobile) and to filter out bot traffic
- Save IP address for a planned feature: segment users by countries and cities
- “In house” solution: track the data in the application’s own database
- Only backend functionality for tracking, no frontend tracking
- Create chart for visitors in the last 28 days, and most visited pages in the same period
- Build the MVP and push back any optional features, like
- Aggregate the data into a separate table instead of querying the page_views table (I’ll build it when the queries become slow)
- Add a geoip database, and save the user’s country and city based on their IP
- Add possibility to change the time period shown on the charts
The database
As I mentioned earlier, the goal was to keep the whole thing very simple, so the database only consists of one table called laravel_analytics_page_views, where the laravel_analytics_ prefix is configurable in the config file to prevent potential conflicts with the app’s database tables.
The schema structure/migration looks like this:
$tableName = config('laravel-analytics.db_prefix') . 'page_views';

Schema::create($tableName, function (Blueprint $table) {
    $table->id();
    $table->string('session_id')->index();
    $table->string('path')->index();
    $table->string('user_agent')->nullable();
    $table->string('ip')->nullable();
    $table->string('referer')->nullable()->index();
    $table->string('country')->nullable()->index();
    $table->string('city')->nullable();
    $table->string('page_model_type')->nullable();
    $table->string('page_model_id')->nullable();
    $table->timestamp('created_at')->nullable()->index();
    $table->timestamp('updated_at')->nullable();

    $table->index(['page_model_type', 'page_model_id']);
});
We track unique visitors by session_id, which is of course not perfect and not 100% accurate, but it does the job.
We create a polymorphic relation with page_model_type and page_model_id: if there is a relevant model for the tracked page, we save its type and id to use in the future if necessary. I also created a combined index for these two fields, as they are mostly queried together when using polymorphic relations.
The middleware
I wanted a universal solution: rather than adding the analytics calls to all the controllers, I created a middleware which can handle the tracking. The middleware can be added to all routes or to specific group(s) of routes.
The middleware itself is quite simple: it tracks only the GET requests and skips the AJAX calls. As it doesn’t make sense to track bot traffic, I used the https://github.com/JayBizzle/Crawler-Detect package to detect crawlers and bots. When a crawler is detected, the middleware simply skips the tracking; this way we avoid having useless data in the table.
It was somewhat tricky to get the associated model for the URL in a universal way. The final solution is not totally universal, because it assumes that the app uses route model binding and that the first binding is relevant to that page. Again, it is not perfect, but it fits the minimalistic approach I followed while developing this package.
Here is the code of the middleware:
public function handle(Request $request, Closure $next)
{
    $response = $next($request);

    try {
        if (!$request->isMethod('GET')) {
            return $response;
        }

        if ($request->isJson()) {
            return $response;
        }

        $userAgent = $request->userAgent();

        if (is_null($userAgent)) {
            return $response;
        }

        /** @var CrawlerDetect $crawlerDetect */
        $crawlerDetect = app(CrawlerDetect::class);

        if ($crawlerDetect->isCrawler($userAgent)) {
            return $response;
        }

        /** @var PageView $pageView */
        $pageView = PageView::make([
            'session_id' => session()->getId(),
            'path' => $request->path(),
            'user_agent' => Str::substr($userAgent, 0, 255),
            'ip' => $request->ip(),
            'referer' => $request->headers->get('referer'),
        ]);

        $parameters = $request->route()?->parameters();
        $model = null;

        if (!is_null($parameters)) {
            $model = reset($parameters);
        }

        if (is_a($model, Model::class)) {
            $pageView->pageModel()->associate($model);
        }

        $pageView->save();

        return $response;
    } catch (Throwable $e) {
        report($e);

        return $response;
    }
}
The routes
When developing Laravel packages it is possible to set up the package service provider to tell the application to use the routes from the package. I usually don’t use this approach, because it doesn’t give the application much control over the routes: for example, you cannot add a prefix, put them in a group, or add middleware to them.
Instead, I like to create a class with a static routes() method, where I define all the routes.
public static function routes()
{
    Route::get(
        'analytics/page-views-per-days',
        [AnalyticsController::class, 'getPageViewsPerDays']
    );
    Route::get(
        'analytics/page-views-per-path',
        [AnalyticsController::class, 'getPageViewsPerPaths']
    );
}
This way I can easily put the package routes under the /admin prefix in my application, for example.
The frontend components
The frontend part consists of two Vue components: one for the visitor chart and one containing a simple table of the most visited pages. For the chart I used the vue-chartjs library (https://github.com/apertureless/vue-chartjs).
<template>
  <div>
    <div><strong>Visitors: </strong></div>
    <div>
      <LineChartGenerator
        :chart-options="chartOptions"
        :chart-data="chartData"
        :chart-id="chartId"
        :dataset-id-key="datasetIdKey"
        :plugins="plugins"
        :css-classes="cssClasses"
        :styles="styles"
        :width="width"
        :height="height"
      />
    </div>
  </div>
</template>

<script>
import { Line as LineChartGenerator } from 'vue-chartjs/legacy'
import {
  Chart as ChartJS,
  Title,
  Tooltip,
  Legend,
  LineElement,
  LinearScale,
  CategoryScale,
  PointElement
} from 'chart.js'

ChartJS.register(
  Title, Tooltip, Legend, LineElement, LinearScale, CategoryScale, PointElement
)

export default {
  name: 'VisitorsPerDays',
  components: { LineChartGenerator },
  props: {
    initialData: Object,
    baseUrl: String,
    chartId: { type: String, default: 'line-chart' },
    datasetIdKey: { type: String, default: 'label' },
    width: { type: Number, default: 400 },
    height: { type: Number, default: 400 },
    cssClasses: { default: '', type: String },
    styles: { type: Object, default: () => {} },
    plugins: { type: Array, default: () => [] }
  },
  data() {
    return {
      chartData: {
        labels: Object.keys(this.initialData),
        datasets: [
          {
            label: 'Visitors',
            backgroundColor: '#f87979',
            data: Object.values(this.initialData)
          }
        ]
      },
      chartOptions: {
        responsive: true,
        maintainAspectRatio: false,
        scales: { y: { ticks: { precision: 0 } } }
      }
    }
  }
}
</script>
Conclusion
It was quite a fun and interesting project, and after using it for about a month and analysing the results, it seems to be working fine. If you are interested in the code, or would like to try the package, feel free to check it out on GitHub here: https://github.com/wdev-rs/laravel-analytics
Laravel News Links
Python Converting List of Strings to * [Ultimate Guide]
Since I frequently handle textual data with Python, I’ve encountered the challenge of converting lists of strings into different data types time and again. This article, originally penned for my own reference, decisively tackles this issue and might just prove useful for you too!
Let’s get started!
Python Convert List of Strings to Ints

This section is for you if you have a list of strings representing numbers and want to convert them to integers.
The first approach is using a for loop to iterate through the list and convert each string to an integer using the int() function.
Here’s a code snippet to help you understand:
string_list = ['1', '2', '3']
int_list = []
for item in string_list:
    int_list.append(int(item))
print(int_list)  # Output: [1, 2, 3]
Another popular method is using list comprehension. It’s a more concise way of achieving the same result as the for loop method.
Here’s an example:
string_list = ['1', '2', '3']
int_list = [int(item) for item in string_list]
print(int_list)  # Output: [1, 2, 3]
You can also use the built-in map() function, which applies a specified function (in this case, int()) to each item in the input list. Just make sure to convert the result back to a list using list().
Take a look at this example:
string_list = ['1', '2', '3']
int_list = list(map(int, string_list))
print(int_list)  # Output: [1, 2, 3]
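All three approaches raise a ValueError if any string is not a valid integer. If your data may be messy, a small helper can skip invalid entries (a minimal sketch; the safe_int name is made up for this example):

```python
def safe_int(s, default=None):
    """Convert s to int, returning default when the string is not numeric."""
    try:
        return int(s)
    except ValueError:
        return default

string_list = ['1', 'x', '3']
# Keep only the entries that converted successfully
int_list = [n for n in (safe_int(s) for s in string_list) if n is not None]
print(int_list)  # Output: [1, 3]
```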
For a full guide on the matter, check out our blog tutorial:
Recommended: How to Convert a String List to an Integer List in Python
Python Convert List Of Strings To Floats

If you want to convert a list of strings to floats in Python, you’ve come to the right place. Next, let’s explore a few different ways you can achieve this.
First, one simple and Pythonic way to convert a list of strings to a list of floats is by using list comprehension.
Here’s how you can do it:
strings = ["1.2", "2.3", "3.4"]
floats = [float(x) for x in strings]
In this example, the list comprehension iterates over each element in the strings list, converting each element to a float using the built-in float() function.
Another approach is to use the map() function along with float() to achieve the same result:
strings = ["1.2", "2.3", "3.4"]
floats = list(map(float, strings))
The map() function applies the float() function to each element in the strings list, and then we convert the result back to a list using the list() function.
If your strings contain decimal separators other than the dot (.), like a comma (,), you need to replace them first before converting to floats:
strings = ["1,2", "2,3", "3,4"]
floats = [float(x.replace(',', '.')) for x in strings]
This will ensure that the values are correctly converted to float numbers.
Recommended: How to Convert a String List to a Float List in Python
Python Convert List Of Strings To String
You might need to convert a list of strings into a single string in Python. It’s quite simple! You can use the join() method to combine the elements of your list.
Here’s a quick example:
string_list = ['hello', 'world']
result = ''.join(string_list)  # Output: 'helloworld'
You might want to separate the elements with a specific character or pattern, like spaces or commas. Just modify the string used in the join() method:
result_with_spaces = ' '.join(string_list)   # Output: 'hello world'
result_with_commas = ', '.join(string_list)  # Output: 'hello, world'
If your list contains non-string elements such as integers or floats, you’ll need to convert them to strings first using a list comprehension or the map() function:
integer_list = [1, 2, 3]

# Using list comprehension
str_list = [str(x) for x in integer_list]
result = ','.join(str_list)  # Output: '1,2,3'

# Using map function
str_list = map(str, integer_list)
result = ','.join(str_list)  # Output: '1,2,3'
Play around with different separators and methods to find what best suits your needs.
Python Convert List Of Strings To One String

Are you looking for a simple way to convert a list of strings to a single string in Python?
The easiest method to combine a list of strings into one string uses the join() method. Just pass the list of strings as an argument to join(), and it’ll do the magic for you.
Here’s an example:
list_of_strings = ["John", "Charles", "Smith"]
combined_string = " ".join(list_of_strings)
print(combined_string)
Output:
John Charles Smith
You can also change the separator by modifying the string before the join() call. Now let’s say your list has a mix of data types, like integers and strings. No problem! Use the map() function along with join() to handle this situation:
list_of_strings = ["John", 42, "Smith"]
combined_string = " ".join(map(str, list_of_strings))
print(combined_string)
Output:
John 42 Smith
In this case, the map() function converts every element in the list to a string before joining them.
Another solution is using the str.format() method to merge the list elements. This is especially handy when you want to follow a specific template.
For example:
list_of_strings = ["John", "Charles", "Smith"]
result = "{} {} {}".format(*list_of_strings)
print(result)
Output:
John Charles Smith
And that’s it! Now you know multiple ways to convert a list of strings into one string in Python.
Python Convert List of Strings to Comma Separated String
So you’d like to convert a list of strings to a comma-separated string using Python.
Here’s a simple solution that uses the join() function:
string_list = ['apple', 'banana', 'cherry']
comma_separated_string = ','.join(string_list)
print(comma_separated_string)
This code would output:
apple,banana,cherry
Using the join() function is a fantastic and efficient way to concatenate strings in a list, adding your desired delimiter (in this case, a comma) between every element.
In case your list doesn’t only contain strings, don’t sweat! You can still convert it to a comma-separated string, even if it includes integers or other types. Just use a generator expression along with the str() function:
mixed_list = ['apple', 42, 'cherry']
comma_separated_string = ','.join(str(item) for item in mixed_list)
print(comma_separated_string)
And your output would look like:
apple,42,cherry
Now you have a versatile method to handle lists containing different types of elements.

Remember, if your list includes strings containing commas, you might want to choose a different delimiter or use quotes to better differentiate between items.
For example:
list_with_commas = ['apple,green', 'banana,yellow', 'cherry,red']
comma_separated_string = '"{}"'.format('", "'.join(list_with_commas))
print(comma_separated_string)
Here’s the output you’d get:
"apple,green", "banana,yellow", "cherry,red"
With these tips and examples, you should be able to easily convert a list of strings (or mixed data types) to comma-separated strings in Python.
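For items that themselves contain commas, Python’s standard csv module handles the quoting for you automatically, which is more robust than hand-rolled quotes. A minimal sketch:

```python
import csv
import io

list_with_commas = ['apple,green', 'banana,yellow', 'cherry,red']

# csv.writer quotes any field that contains the delimiter
buffer = io.StringIO()
csv.writer(buffer).writerow(list_with_commas)
line = buffer.getvalue().strip()
print(line)  # Output: "apple,green","banana,yellow","cherry,red"
```

The same module can parse such a line back into the original list with csv.reader, so the round trip is lossless.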
Python Convert List Of Strings To Lowercase
Let’s dive into converting a list of strings to lowercase in Python. In this section, you’ll learn three handy methods to achieve this. Don’t worry, they’re easy!
Solution: List Comprehension
Firstly, you can use list comprehension to create a list with all lowercase strings. This is a concise and efficient way to achieve your goal.
Here’s an example:
original_list = ["Hello", "WORLD", "PyThon"]
lowercase_list = [item.lower() for item in original_list]
print(lowercase_list)  # Output: ['hello', 'world', 'python']
With this approach, the lower() method is applied to each item in the list, creating a new list with lowercase strings.
Solution: map() Function
Another way to convert a list of strings to lowercase is by using the map() function. This function applies a given function (in our case, str.lower()) to each item in a list.
Here’s an example:
original_list = ["Hello", "WORLD", "PyThon"]
lowercase_list = list(map(str.lower, original_list))
print(lowercase_list)  # Output: ['hello', 'world', 'python']
Remember to wrap the map() function with the list() function to get your desired output.
Solution: For Loop
Lastly, you can use a simple for loop. This approach might be more familiar and readable to some, but it’s typically less efficient than the other methods mentioned.
Here’s an example:
original_list = ["Hello", "WORLD", "PyThon"]
lowercase_list = []
for item in original_list:
    lowercase_list.append(item.lower())
print(lowercase_list)  # Output: ['hello', 'world', 'python']
I have written a complete guide on this on the Finxter blog. Check it out!
Recommended: Python Convert String List to Lowercase
Python Convert List of Strings to Datetime

In this section, we’ll guide you through converting a list of strings to datetime objects in Python. It’s a common task when working with date-related data, and it can be quite easy to achieve with the right tools!
So, let’s say you have a list of strings representing dates, and you want to convert this into a list of datetime objects. First, you’ll need to import the datetime module to access the essential functions.
from datetime import datetime
Next, you can use the strptime() function from the datetime module to convert each string in your list to a datetime object. To do this, simply iterate over the list of strings and apply strptime() with the appropriate date format.
For example, if your list contained dates in the "YYYY-MM-DD" format, your code would look like this:
date_strings_list = ["2023-05-01", "2023-05-02", "2023-05-03"]
date_format = "%Y-%m-%d"
datetime_list = [datetime.strptime(date_string, date_format) for date_string in date_strings_list]
By using list comprehension, you’ve efficiently transformed your list of strings into a list of datetime objects!
Keep in mind that you’ll need to adjust the date_format variable according to the format of the dates in your list of strings. Here are some common date format codes you might need:
- %Y: Year with century, as a decimal number (e.g., 2023)
- %m: Month as a zero-padded decimal number (e.g., 05)
- %d: Day of the month as a zero-padded decimal number (e.g., 01)
- %H: Hour (24-hour clock) as a zero-padded decimal number (e.g., 17)
- %M: Minute as a zero-padded decimal number (e.g., 43)
- %S: Second as a zero-padded decimal number (e.g., 08)
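Combining these codes, a timestamp string with both date and time parts parses the same way (the sample timestamps below are made up to match the format):

```python
from datetime import datetime

timestamps = ["2023-05-01 17:43:08", "2023-05-02 09:15:00"]
fmt = "%Y-%m-%d %H:%M:%S"

# strptime handles date and time components in one pass
parsed = [datetime.strptime(ts, fmt) for ts in timestamps]
print(parsed[0].hour)  # Output: 17
```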
Python Convert List Of Strings To Bytes
So you want to convert a list of strings to bytes in Python? No worries, I’ve got your back. This brief section will guide you through the process.
First things first, serialize your list of strings as a JSON string, and then convert it to bytes. You can easily do this using Python’s built-in json module.
Here’s a quick example:
import json

your_list = ['hello', 'world']
list_str = json.dumps(your_list)
list_bytes = list_str.encode('utf-8')
Now, list_bytes is the byte representation of your original list.
But hey, what if you want to get back the original list from those bytes? Simple! Just do the reverse:
reconstructed_list = json.loads(list_bytes.decode('utf-8'))
And voilà! You’ve successfully converted a list of strings to bytes and back again in Python.
Remember that this method works well for lists containing strings. If your list includes other data types, you may need to convert them to strings first.
Python Convert List of Strings to Dictionary

Next, you’ll learn how to convert a list of strings to a dictionary. This can come in handy when you want to extract meaningful data from a list of key-value pairs represented as strings.
To get started, let’s say you have a list of strings that look like this:
data_list = ["Name: John", "Age: 30", "City: New York"]
You can convert this list into a dictionary using a simple loop and the split() method.
Here’s the recipe:
data_dict = {}
for item in data_list:
    key, value = item.split(": ")
    data_dict[key] = value

print(data_dict)
# Output: {'Name': 'John', 'Age': '30', 'City': 'New York'}
Sweet, you just converted your list to a dictionary! But, what if you want to make it more concise? Python offers an elegant solution with dictionary comprehension.
Check this out:
data_dict = {item.split(": ")[0]: item.split(": ")[1] for item in data_list}
print(data_dict)
# Output: {'Name': 'John', 'Age': '30', 'City': 'New York'}
With just one line of code, you achieved the same result. High five!
When dealing with more complex lists that contain strings in various formats or nested structures, it’s essential to use additional tools like the json.loads() method or the ast.literal_eval() function. But for simple cases like the example above, the loop and dictionary comprehension should be more than enough.
Python Convert List Of Strings To Bytes-Like Object
How to convert a list of strings into a bytes-like object in Python? It’s quite simple and can be done easily using the json library and the utf-8 encoding.
Firstly, let’s tackle encoding your list of strings as a JSON string. You can use the json.dumps() function to achieve this.
Here’s an example:
import json

your_list = ['hello', 'world']
json_string = json.dumps(your_list)
Now that you have the JSON string, you can convert it to a bytes-like object using the encode() method of the string. Simply specify the encoding you’d like to use, which in this case is 'utf-8':
bytes_object = json_string.encode('utf-8')
And that’s it! Your list of strings has been successfully transformed into a bytes-like object. To recap, here’s the complete code snippet:
import json

your_list = ['hello', 'world']
json_string = json.dumps(your_list)
bytes_object = json_string.encode('utf-8')
If you ever need to decode the bytes-like object back into a list of strings, just use the decode() method followed by the json.loads() function like so:
decoded_string = bytes_object.decode('utf-8')
original_list = json.loads(decoded_string)
Python Convert List Of Strings To Array

Converting a list of strings to an array in Python is a piece of cake.
One simple approach is using the NumPy library, which offers powerful tools for working with arrays. To start, make sure you have NumPy installed. Afterward, you can create an array using the numpy.array() function. Like so:
import numpy as np

string_list = ['apple', 'banana', 'cherry']
string_array = np.array(string_list)
Now your list is enjoying its new life as an array!
But sometimes, you may need to convert a list of strings into a specific data structure, like a NumPy character array. For this purpose, numpy.char.array() comes to the rescue:
char_array = np.char.array(string_list)
Now you have a character array! Easy as pie, right?
If you want to explore more options, check out the built-in split() method that lets you convert a string into a list, and subsequently into an array. This method is especially handy when you need to split a string based on a separator or a regular expression.
Python Convert List Of Strings To JSON
You’ve probably encountered a situation where you need to convert a list of strings to JSON format in Python. Don’t worry! We’ve got you covered. In this section, we’ll discuss a simple and efficient method to convert a list of strings to JSON using the json module in Python.
First things first, let’s import the necessary module:
import json
Now that you’ve imported the json module, you can use the json.dumps() function to convert your list of strings to a JSON string.
Here’s an example:
string_list = ["apple", "banana", "cherry"]
json_string = json.dumps(string_list)
print(json_string)
This will output the following JSON string:
["apple", "banana", "cherry"]
Great job! You’ve successfully converted a list of strings to JSON. But what if your list contains strings that are already in JSON format?
In this case, you can use the json.loads()
function:
string_list = ['{"name": "apple", "color": "red"}', '{"name": "banana", "color": "yellow"}']
json_list = [json.loads(string) for string in string_list]
print(json_list)
The output will be a list of Python dictionaries (note the single quotes — this is a Python repr, not JSON):
[{'name': 'apple', 'color': 'red'}, {'name': 'banana', 'color': 'yellow'}]
And that’s it! Now you know how to convert a list of strings to JSON in Python, whether it’s a simple list of strings or a list of strings already in JSON format.
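If you need to go the other way and serialize that list of dictionaries back into a single JSON string, json.dumps() closes the loop:

```python
import json

json_list = [{"name": "apple", "color": "red"},
             {"name": "banana", "color": "yellow"}]

# Serialize the Python list of dicts back into one JSON string
round_trip = json.dumps(json_list)
```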
Python Convert List Of Strings To Numpy Array

Are you looking to convert a list of strings to a NumPy array in Python? Let's briefly walk through how to achieve this.
First things first, you need to import numpy
. If you don’t have it installed, simply run pip install numpy
in your terminal or command prompt.
Once you’ve done that, you can import numpy
in your Python script as follows:
import numpy as np
Now that numpy
is imported, let’s say you have a list of strings with numbers that you want to convert to a numpy array, like this:
A = ['33.33', '33.33', '33.33', '33.37']
To convert this list of strings into a NumPy array, you can use a simple list comprehension to first convert the strings to floats, and then pass the result to the np.array()
function to create the array:
floats = [float(e) for e in A]
array_A = np.array(floats)
Congratulations! You’ve successfully converted your list of strings to a numpy array! Now that you have your numpy array, you can perform various operations on it. Some common operations include:
- Finding the mean, min, and max:
mean_A, min_A, max_A = np.mean(array_A), np.min(array_A), np.max(array_A)
- Reshaping the array:
reshaped_array = array_A.reshape(2, 2)
- Performing element-wise operations (e.g., adding):
array_B = np.array([1.0, 2.0, 3.0, 4.0])
result = array_A + array_B
Now you know how to convert a list of strings to a numpy array and perform various operations on it.
Python Convert List of Strings to Numbers
To convert a list of strings to numbers, Python’s map()
function can be your best friend. It applies a given function to each item in an iterable: pass map()
either int
or float
along with your list.
Here’s an example:
string_list = ["1", "2", "3", "4", "5"]
numbers_int = list(map(int, string_list))
numbers_float = list(map(float, string_list))
Alternatively, using list comprehension is another great approach. Just loop through your list of strings and convert each element accordingly.
Here’s what it looks like:
numbers_int = [int(x) for x in string_list]
numbers_float = [float(x) for x in string_list]
Maybe you’re working with a list that contains a mix of strings representing integers and floats. In that case, you can implement a conditional list comprehension like this:
mixed_list = ["1", "2.5", "3", "4.2", "5"]
numbers_mixed = [int(x) if "." not in x else float(x) for x in mixed_list]
And that’s it! Now you know how to convert a list of strings to a list of numbers using Python, using different techniques like the map
function and list comprehension.
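One caveat: both map(int, ...) and the comprehensions above raise a ValueError the moment they hit a non-numeric string. A small helper (to_number is a hypothetical name, not a built-in) lets you flag bad entries instead of crashing:

```python
def to_number(s):
    """Return int(s) if possible, else float(s), else None."""
    try:
        return int(s)
    except ValueError:
        try:
            return float(s)
        except ValueError:
            return None

# 'oops' is not numeric, so it maps to None instead of raising
values = [to_number(x) for x in ["1", "2.5", "oops"]]
```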
Python Convert List Of Strings To Array Of Floats

Starting out, you might have a list of strings containing numbers, like
['1.2', '3.4', '5.6']
, and you want to convert these strings to an array of floats in Python.
Here’s how you can achieve this seamlessly:
Using List Comprehension
List comprehension is a concise way to create lists in Python. To convert the list of strings to a list of floats, you can use the following code:
list_of_strings = ['1.2', '3.4', '5.6']
list_of_floats = [float(x) for x in list_of_strings]
This will give you a new list
list_of_floats
containing [1.2, 3.4, 5.6]
.
Using NumPy
If you have numpy installed or are working with larger arrays, you might want to convert the list of strings to a numpy array of floats.
Here’s how you can do that:
import numpy as np
list_of_strings = ['1.2', '3.4', '5.6']
numpy_array = np.array(list_of_strings, dtype=float)
Now you have a numpy array of floats: array([1.2, 3.4, 5.6])
.
Converting Nested Lists
If you’re working with a nested list of strings representing numbers, like:
nested_list_of_strings = [['1.2', '3.4'], ['5.6', '7.8']]
You can use the following list comprehension:
nested_list_of_floats = [[float(x) for x in inner] for inner in nested_list_of_strings]
This will result in a nested list of floats like [[1.2, 3.4], [5.6, 7.8]]
.
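If NumPy is available, the same nested conversion is a one-liner, since np.array() recurses into nested lists and converts every element while building the array:

```python
import numpy as np

nested_list_of_strings = [['1.2', '3.4'], ['5.6', '7.8']]

# dtype=float converts each string while building a 2x2 array
matrix = np.array(nested_list_of_strings, dtype=float)
```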
Pheww! Hope this article helped you solve your conversion problems.
Free Cheat Sheets!
If you want to keep learning Python and improving your skills, feel free to check out our Python cheat sheets (100% free):
Be on the Right Side of Change
How “Invisible” Metal Cuts Are Made
https://theawesomer.com/photos/2023/04/thin_line_metal_cuts_t.jpg
Metal objects like the Metmo Cube are fascinating because they feature parts that are so precisely cut that you can’t see where one piece begins and the other one ends. Science educator Steve Mould explains wire EDM machining, which enables the creation of such incredibly tight-fitting objects.
The Awesomer
Save Money in AWS RDS: Don’t Trust the Defaults
https://www.percona.com/blog/wp-content/uploads/2023/03/lucas.speyer_an_icon_of_an_electronic_cloud_97fa4765-ec96-44fb-b23e-dbe3512b9710-150×150.png
Default settings can help you get started quickly – but they can also cost you performance and a higher cloud bill at the end of the month. Want to save money on your AWS RDS bill? I’ll show you some MySQL settings to tune to get better performance, and cost savings, with AWS RDS.
Recently I was engaged in a MySQL Performance Audit for a customer to help troubleshoot performance issues that led to downtime during periods of high traffic on their AWS RDS MySQL instances. During heavy loads, they would see messages about their InnoDB settings in the error logs:
[Note] InnoDB: page_cleaner: 1000ms intended loop took 4460ms. The settings might not be optimal. (flushed=140, during the time.)
This message is normally a side effect of a storage subsystem that is not capable of keeping up with the number of writes (e.g., IOPs) required by MySQL. This is “Hey MySQL, try to write less. I can’t keep up,” which is a common situation when innodb_io_capacity_max is set too high.
After some time of receiving these messages, they eventually hit performance issues to the point that the server became unresponsive for a few minutes. After that, things went back to normal.
Let’s look at the problem and try to gather some context information.
Investigating AWS RDS performance issues
We had a db.m5.8xlarge instance type (32vCPU – 128GB of RAM) with a gp2 storage of 5TB, which should provide up to 10000 IOPS (this is the maximum capacity allowed by gp2), running MySQL 5.7. This is a pretty decent setup, and I don’t see many customers needing to write this many sustained IOPS.
The innodb_io_capacity_max parameter was set to 2000, so the hardware should be able to deliver that many IOPS without major issues. However, gp2 suffers from a tricky way of calculating credits and usage that may drive erroneous conclusions about the real capacity of the storage. Reviewing the CloudWatch graphics, we only had roughly 8-9k IOPS (reads and writes) used during spikes.
While the IO utilization was quite high, there should be some room to get more IOPS, but we were still seeing errors. What caught my attention was the self-healing condition shown by MySQL after a few minutes.
Normally, the common solution that was actually discussed during our kick-off call was, “Well, there is always the chance to move to Provisioned IOPS, but that is quite expensive.” Yes, this is true, io2 volumes are expensive, and honestly, I think they should be used only where really high IO capacity at expected latencies is required, and this didn’t seem to be the case.
Otherwise, most of the environments can adapt to gp2/gp3 volumes; for that matter, you need to provision a big enough volume and get enough IOPS.
Finding the “smoking gun” with pt-mysql-summary
Not too long ago, my colleague Yves Trudeau and I worked on a series of posts debating how to configure an instance for write-intensive workloads. A quick look at the pt-mysql-summary output shows something really interesting when approaching the issue out of the busy period of load:
# InnoDB #####################################################
                  Version | 5.7.38
         Buffer Pool Size | 93.0G
         Buffer Pool Fill | 100%
        Buffer Pool Dirty | 1%
           File Per Table | ON
                Page Size | 16k
            Log File Size | 2 * 128.0M = 256.0M
          Log Buffer Size | 8M
             Flush Method | O_DIRECT
      Flush Log At Commit | 1
               XA Support | ON
                Checksums | ON
              Doublewrite | ON
          R/W I/O Threads | 4 4
             I/O Capacity | 200
       Thread Concurrency | 0
      Concurrency Tickets | 5000
       Commit Concurrency | 0
      Txn Isolation Level | REPEATABLE-READ
        Adaptive Flushing | ON
      Adaptive Checkpoint |
           Checkpoint Age | 78M
             InnoDB Queue | 0 queries inside InnoDB, 0 queries in queue
Wait, what? 256M of redo logs and a Checkpoint Age of only 78M? That is quite conservative considering a 93GB buffer pool size; you would expect much bigger redo logs for a buffer pool that large. Bingo! We have a smoking gun here.
Additionally, full ACID durability was enabled, that is, innodb_flush_log_at_trx_commit=1 and sync_binlog=1, which adds a lot of write overhead to every operation because, during the commit stage, everything is flushed to disk (or to gp2 in this case).
Considering a spike of load running a lot of writing queries, hitting the max checkpoint age in this setup is a very likely situation.
Basically, MySQL will perform flushing operations at a certain rate depending on several factors. This rate is normally close to innodb_io_capacity (200 by default); if the number of writes starts to approach the max checkpoint age, the adaptive flushing algorithm will push the rate up toward innodb_io_capacity_max (2000 by default) to try to keep the free space in the redo logs far from the max checkpoint age limit.
If we keep pushing, we can eventually reach the max checkpoint age, which drives the system into a synchronous state: furious flushing happens beyond innodb_io_capacity_max, and all write operations are paused (freezing writes) until there is free room in the redo logs to keep writing.
This was exactly what was happening on this server. We calculated roughly how many writes were being performed per hour, and then we recommended increasing the size of redo log files to 2x2GB each (4GB total). In practical terms, it was 3.7G due to some rounding that RDS does, so we got:
# InnoDB #####################################################
                  Version | 5.7.38
         Buffer Pool Size | 92.0G
         Buffer Pool Fill | 100%
        Buffer Pool Dirty | 2%
           File Per Table | ON
                Page Size | 16k
            Log File Size | 2 * 1.9G = 3.7G
          Log Buffer Size | 8M
             Flush Method | O_DIRECT
Then we also increased the innodb_io_capacity_max to 4000, so we let the adaptive flushing algorithm increase writes with some more room. Results in CloudWatch show we were right:
The reduction during the last couple of weeks is more than 50% of IOPS, which is pretty decent now, and we haven’t changed the hardware at all. Actually, it was possible to reduce the storage size to 3TB and avoid moving to expensive io2 (provisioned IOPS) storage.
Conclusions
RDS normally works very well out of the box; most of the configuration is properly set for the type of instance provisioned. Still, the RDS default redo log size is surprisingly small, and people using a fully managed solution would reasonably expect not to worry about this kind of common tuning.
MySQL 8.0 implemented innodb_dedicated_server, which auto-sizes innodb_log_file_size and innodb_log_files_in_group (now replaced by innodb_redo_log_capacity) as a function of the InnoDB buffer pool size using a simple but effective algorithm, and I guess it shouldn't be hard for the AWS team to implement it. We've done some research, and it seems RDS is not carrying this logic into the 8.0 versions, which makes such a small default for innodb_redo_log_capacity seem strange.
In the meantime, checking how RDS MySQL is configured with default parameters is something we all should review to avoid the typical “throwing more hardware solution” – and, by extension, spending more money.
Percona Consultants have decades of experience solving complex database performance issues and design challenges. They’ll work with you to understand your goals and objectives and provide the best, unbiased solutions for your database environment.
Learn more about Percona Consulting
A personalized Percona Database Performance Audit will help uncover potential performance killers in your current configuration.
Percona Database Performance Blog
4 Things To Do When Laravel App Goes Live [VIDEO]
http://img.youtube.com/vi/UyopFbFRug8/0.jpgToday I’m answering a question from one of you: what to do to ensure the already-launched Laravel application is more stable on the live server.Laravel News Links