Laravel package for using Microsoft mail, OneDrive, Teams, Excel, Calendars and Contacts


This package provides a wrapper around the Microsoft Graph API.

  1. It provides a Mail driver for Microsoft mail.
  2. It provides a storage driver for OneDrive.
  3. It provides functionality to interact with Microsoft Teams.
  4. It provides the possibility to work with Excel, making it possible to write and read Excel files.
  5. It allows you to manage calendar events.
  6. It allows you to manage contacts.
  7. It allows you to read and handle mail.

You need to register an app in the Microsoft Azure Portal to use this package. Follow the steps in the Microsoft docs:
https://docs.microsoft.com/en-us/graph/auth-register-app-v2

You can install the package via composer:

composer require lloadout/microsoftgraph

Add this to your .env file and fill it with the values from your Microsoft Azure Portal app registration.
If you created a multi-tenant app in Azure AD, don't put your tenant id in the MS_TENANT_ID variable; set it to common instead.

MS_TENANT_ID=
MS_CLIENT_ID=
MS_CLIENT_SECRET=
MS_GRAPH_API_VERSION=v1.0
MS_REDIRECT_URL=https://your-url.com/microsoft/callback

The package uses OAuth and provides two routes.

The first redirects you to Microsoft's consent screen:

https://your-url.com/microsoft/connect

The second is the callback URL, which you must specify as the redirect URI in your Microsoft Azure Portal app registration:

https://your-url.com/microsoft/callback

The callback fires a MicrosoftGraphCallbackReceived event. You have to listen for this event in your EventServiceProvider and store the accessData in a session variable named microsoftgraph-access-data.
You can add your token storage logic in a listener for this event.

public function boot()
{
    Event::listen(function (MicrosoftGraphCallbackReceived $event) {
        session()->put('microsoftgraph-access-data', $event->accessData); 
    });
}

The package looks for a session variable named microsoftgraph-access-data when establishing the connection, so
please set this variable to your accessData when logging in.
For example: on login, fetch your accessData from the database and store it in the session
variable microsoftgraph-access-data.
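For example, a login listener along these lines could restore the stored access data (a sketch; the microsoft_access_data attribute on the user model and the Login event wiring are assumptions, not part of the package):

```php
use Illuminate\Auth\Events\Login;
use Illuminate\Support\Facades\Event;

// Sketch: restore previously persisted access data when a user logs in.
// The microsoft_access_data attribute on the user model is an assumption.
Event::listen(function (Login $event) {
    if ($accessData = $event->user->microsoft_access_data) {
        session()->put('microsoftgraph-access-data', $accessData);
    }
});
```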

You have to provide these API permissions: Mail.Send

Set the environment variable MAIL_MAILER in your .env file

MAIL_MAILER=microsoftgraph

Note: make sure your from address is the address you gave consent for.

Mail::send(new YourMailable());

Mail::raw('The body of my first test message', function($message) {
    $message->to('john@doe.com', 'John Doe')->subject('A mail sent via lloadout/microsoftgraph');
});

Reading and handling mail

You have to provide these API permissions: Mail.Read, Mail.ReadWrite, Mail.ReadBasic

    getMailFolders(): array|GraphResponse|mixed
    getSubFolders(id): array|GraphResponse|mixed
    getMailMessagesFromFolder([folder: string = 'inbox'], [isRead: bool = true], [skip: int = 0], [limit: int = 20]): array
    updateMessage(id, data): array|GraphResponse|mixed
    moveMessage(id, destinationId): array|GraphResponse|mixed
    getMessage(id): array|GraphResponse|mixed
    getMessageAttachements(id): array|GraphResponse|mixed

First instantiate the Mail class

    $mail = app(Mail::class);

    //get all the mail folders
    collect($mail->getMailFolders())->each(function($folder){
        echo $folder['displayName']."<br />";
    });

    //get all unread messages from inbox
    collect($mail->getMailMessagesFromFolder('inbox', isRead: false))->each(function($message) use ($mail){
        echo $message['subject']."<br />";
    });
        

You have to provide these API permissions: Files.ReadWrite.All

Add the OneDrive root to your .env file:

MS_ONEDRIVE_ROOT="me/drive/root"

The package registers a disk called onedrive, so all methods from the Laravel Storage facade are available, as described in the Laravel docs: https://laravel.com/docs/8.x/filesystem#configuration

$disk = Storage::disk('onedrive');
#create a dir
$disk->makeDirectory('Test folder');
#storing files
$disk->put('Test folder/file1.txt','Content of file 1');
$disk->put('Test folder/file2.txt','Content of file 2');
#getting files
Storage::disk('onedrive')->get('Test folder/file1.txt');

You have to provide these API permissions: Chat.ReadWrite

    getJoinedTeams(): array|GraphResponse|mixed
    getChannels(team): array|GraphResponse|mixed
    getChats(): array|GraphResponse|mixed
    getChat(id): array|GraphResponse|mixed
    getMembersInChat(chat): array|GraphResponse|mixed
    send(teamOrChat, message): array|GraphResponse|mixed

First instantiate the Teams class

$teamsClass = new Teams();

Get all the teams you are a member of ( additional permissions needed: Group.Read.All )

$joinedTeams = $teamsClass->getJoinedTeams();

Get all the channels for a team ( additional permissions needed: Group.Read.All )

$channels = $teamsClass->getChannels($team);

Get all the chats for a user ( additional permissions needed: Chat.Read.All )

$chats = $teamsClass->getChats(); 

Get a chat by a given id ( additional permissions needed: Chat.Read.All )

$chat = $teamsClass->getChat('your-chat-id'); 

Get all the members in a channel ( additional permissions needed: ChannelMessage.Read.All )

$members = $teamsClass->getMembersInChat($chat);

Send a message to a channel ( additional permissions needed: ChannelMessage.Send )

$teamsClass->send($teamOrChat,'Hello world!');

You have to provide these API permissions: Files.ReadWrite.All

    loadFile(file): void
    loadFileById(fileId): void
    setCellValues(cellRange, values: array): void
    getCellValues(cellRange): array
    recalculate(): void
    createSession(fileId): string

First instantiate the Excel class

$excelClass = new Excel();

Load a file from OneDrive

$excelClass->loadFile('Test folder/file1.xlsx');

Load a file by its id

$excelClass->loadFileById($fileId);

Set cell values of a range

$values = ['B1' => null, 'B2' => '01.01.23', 'B3' => 3, 'B4' => '250', 'B5' => '120', 'B6' => '30 cm', 'B7' => null, 'B8' => null, 'B9' => null, 'B10' => null, 'B11' => null, 'B12' => 2];
$excelClass->setCellValues('B1:B12', $values);
$excelClass->getCellValues('H1:H20');
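A sketch combining the remaining methods from the list above ($fileId is a placeholder):

```php
// Sketch based on the method signatures listed above; $fileId is a placeholder.
$excelClass = new Excel();
$excelClass->loadFileById($fileId);
$excelClass->setCellValues('B1:B2', ['B1' => 100, 'B2' => 200]);
$excelClass->recalculate();                    // recalculate formulas in the workbook
$values = $excelClass->getCellValues('B3:B3'); // read back a computed cell
```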

You have to provide these API permissions: Calendars.ReadWrite

    getCalendars(): array
    getCalendarEvents(calendar: Calendar): array
    saveEventToCalendar(calendar: Calendar, event: Event): GraphResponse|mixed
    makeEvent(starttime: string, endtime: string, timezone: string, subject: string, body: string, [attendees: array = [...]], [isOnlineMeeting: bool = false]): Event

First instantiate the Calendar class

$calendarClass = new Calendar();

Get all the calendars

$calendars = $calendarClass->getCalendars();

Get all the events for a calendar

$events = $calendarClass->getCalendarEvents($calendar);

Save an event to a calendar; the event object is a MicrosoftGraphEvent object.
We made a helper function to create an event object:
Calendar::makeEvent(string $starttime, string $endtime, string $timezone, string $subject, string $body, array $attendees = [], bool $isOnlineMeeting = false)

$calendarClass->saveEventToCalendar($calendar, $event);
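Putting it together, creating and saving an online meeting could look like this sketch (dates, subject, and attendees are placeholders):

```php
$calendarClass = new Calendar();
$calendar = $calendarClass->getCalendars()[0]; // pick one of your calendars

// Build the event with the helper described above; all values are placeholders.
$event = Calendar::makeEvent(
    '2024-09-10T10:00:00',
    '2024-09-10T11:00:00',
    'Europe/Brussels',
    'Project kick-off',
    'Agenda: scope and planning',
    ['john@doe.com'],
    true // create as an online meeting
);

$calendarClass->saveEventToCalendar($calendar, $event);
```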

You have to provide these API permissions: Contacts.ReadWrite

First instantiate the Contacts class

$contactsClass = new Contacts();

Get all the contacts

$contacts = $contactsClass->getContacts();

Please see CHANGELOG for more information on what has changed recently.

Please see CONTRIBUTING for details.

Please review our security policy on how to report security vulnerabilities.

The MIT License (MIT). Please see License File for more information.

Laravel News Links

Floor 496

Each visit to Floor796 is like exploring the world's largest Where's Waldo image. The website consists of a constantly-expanding isometric animation of life on the 796th floor of a huge space station. It's packed with pop culture references, and the entire animation has been drawn by a single artist using a special browser-based editing tool. (Thanks, […]) The Awesomer

MySQL Inside: Using the PS error_log table for a quick peek!

Just thought I’d share a script I use daily and helps me redirect my attention if needed.

This is but a mere pointer, guideline and starting point in any task. I just thought I’d share and hope someone else’s day becomes slightly easier thanks to some brief investigation and command tweaking.

Now the really handy thing here is that I only hard code the router01 node name, as I’m using that as a potential endpoint (thinking cloud, XaaS, etc…) where it could also be a VIP, LBR or similar. It’s the entry point so I can query the P_S table error_log so I can get different views and act accordingly.

For example:

  • First, give me the InnoDB Cluster ordered server list so I can take a step back from my usual pains and worries, and see the architecture view. And make me type “Y” or similar to move on. Here if there were any server missing, I’d see the summary right away so I don’t really need to worry about the error_log and a good ol’ CTRL-C is struck.
    • Now, give me a list of errors that appear in the log for the last:

30 minutes

4 hours

24 hours

    And then a summary & count, so I can prioritize my investigation time.

    Some explanations as to what’s being used:

    “–login-path” is how I have configured my env for added simplicity. Feel free to use user@host:port or whatever tickles the proverbial.

    “-A” when calling mysqlsh means it won’t load any cached data.

    “–sqlc” means I’ll execute SQL.

    And at the end I’m curious as to since when the instance has been recording the error_log, so I add it in, for the hell of it.

    Sharing is good, hence the script:

    $ cat error_log.sh
    
    
    
    #!/bin/bash
    
    MYSQLUSER=icadmin
    MYROUTER1=router01
    
    echo
    echo "Current time: $(date)"
    echo
    echo "Remember: login-path needs to be defined."
    
    # Entry point: use the router to read the cluster topology.
    MYSQLSH="mysqlsh --login-path=$MYSQLUSER -h$MYROUTER1 --tabbed -A --sqlc -e "
    MYSQLLOGIN="mysqlsh --login-path=$MYSQLUSER -h"
    
    MYROUTER1=$($MYSQLSH "select min(address) from mysql_innodb_cluster_metadata.routers" | grep -v address)
    MYROUTER2=$($MYSQLSH "select max(address) from mysql_innodb_cluster_metadata.routers" | grep -v address)
    HOST1=$($MYSQLSH "select MEMBER_HOST from performance_schema.replication_group_members where MEMBER_ROLE = 'PRIMARY';" | grep -v MEMBER)
    HOST2=$($MYSQLSH "select min(MEMBER_HOST) from performance_schema.replication_group_members where MEMBER_ROLE != 'PRIMARY';" | grep -v MEMBER)
    HOST3=$($MYSQLSH "select max(MEMBER_HOST) from performance_schema.replication_group_members where MEMBER_ROLE != 'PRIMARY';" | grep -v MEMBER)
    echo
    echo "We have the following information: "
    echo " $MYROUTER1 & $MYROUTER2"
    echo " $HOST1 & $HOST2 & $HOST3"
    echo " $HOSTNAME is the current host."
    echo
    read -p "Continue? (Y/N): " confirm && [[ $confirm == [yY] || $confirm == [yY][eE][sS] ]] || exit 1
    
    
    echo ----------------------------------------------------------------------------------------------------------------------------------------------------------------
    echo
    echo $HOST1
    MYSQLSH="mysqlsh --login-path=$MYSQLUSER -h$HOST1 --table -A --sqlc -e "
    
    echo "Errors in the last 30 mins:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 30 MINUTE)"
    echo
    echo "Errors in the last 4 hours:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 4 HOUR)"
    echo
    echo "Errors for the last 24 hours:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY)"
    echo
    echo "Summary of the latest errors for the last 24h in the error_log:"
    $MYSQLSH "select count(*) from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY)"
    $MYSQLSH "select ERROR_CODE, SUBSYSTEM, DATA, count(*) from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY) group by ERROR_CODE, SUBSYSTEM, DATA"
    echo
    echo "Summary of error log:"
    $MYSQLSH "select min(LOGGED) as Earliest, max(LOGGED) as Last, count(*) as Count from performance_schema.error_log "
    echo
    echo ----------------------------------------------------------------------------------------------------------------------------------------------------------------
    echo
    echo $HOST2
    MYSQLSH="mysqlsh --login-path=$MYSQLUSER -h$HOST2 --table -A --sqlc -e "
    echo
    echo "Current time: $(date)"
    echo
    echo "Errors in the last 30 mins:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 30 MINUTE)"
    echo
    echo "Errors in the last 4 hours:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 4 HOUR)"
    echo
    echo "Errors for the last 24 hours:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY)"
    echo
    echo "Summary of the latest errors for the last 24h in the error_log:"
    $MYSQLSH "select count(*) from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY)"
    $MYSQLSH "select ERROR_CODE, SUBSYSTEM, DATA, count(*) from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY) group by ERROR_CODE, SUBSYSTEM, DATA"
    echo
    echo "Summary of error log:"
    $MYSQLSH "select min(LOGGED) as Earliest, max(LOGGED) as Last, count(*) as Count from performance_schema.error_log "
    echo
    
    echo ----------------------------------------------------------------------------------------------------------------------------------------------------------------
    echo
    echo $HOST3
    MYSQLSH="mysqlsh --login-path=$MYSQLUSER -h$HOST3 --table -A --sqlc -e "
    echo
    echo "Current time: $(date)"
    echo
    echo "Errors in the last 30 mins:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 30 MINUTE)"
    echo
    echo "Errors in the last 4 hours:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 4 HOUR)"
    echo
    echo "Errors for the last 24 hours:"
    $MYSQLSH "select * from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY)"
    echo
    echo "Summary of the latest errors for the last 24h in the error_log:"
    $MYSQLSH "select count(*) from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY)"
    $MYSQLSH "select ERROR_CODE, SUBSYSTEM, DATA, count(*) from performance_schema.error_log where logged > (NOW() - INTERVAL 1 DAY) group by ERROR_CODE, SUBSYSTEM, DATA"
    echo
    echo "Summary of error log:"
    $MYSQLSH "select min(LOGGED) as Earliest, max(LOGGED) as Last, count(*) as Count from performance_schema.error_log "
    echo
    echo "End!"

    Planet MySQL

    Laravel ERD generator


    Laravel ERD automatically generates Entity-Relationship Diagrams from your Laravel models and displays them
    using erd-editor.

    Here’s a sample of what you can expect, generated from migrations
    and models:

    View Live Demo


    Supported versions:
    
    PHP: 7.4, 8.0, 8.1, 8.2, 8.3
    Laravel: 8, 9, 10, 11

    Install the package via Composer:

    composer require recca0120/laravel-erd:^0.1 --dev

    Step 1: Generate the ERD
    
    Run the following command:
    
    php artisan erd:generate

    Step 2: View the ERD

    Open the following URL in your browser:

    http://localhost/laravel-erd

    Exclude Tables and Save to a Different Filename

    Run the command:

    php artisan erd:generate --file=exclude-users.sql --exclude=users

    Open the URL:

    http://localhost/laravel-erd/exclude-users

    Install erd-go
    and graphviz-dot.js using:

    Generate the SVG file:

    php artisan erd:generate --file=laravel-erd.svg

    View the SVG version:

    http://localhost/laravel-erd/laravel-erd.svg

    svg

    The SVG file can be found at storage/framework/cache/laravel-erd.


    Laravel News Links

    Backup Tables


    Backup single or multiple database tables with ease.

    Note: if you want a full database backup with many features go for Spatie Laravel Backup.

    You can install the package via Composer:

    composer require watheqalshowaiter/backup-tables

    Use the BackupTables::generateBackup($tableToBackup) Facade anywhere in your application and it will
    generate a $tableToBackup_backup_2024_08_22_17_40_01 table in the database with all the data and structure. Note that
    the datetime 2024_08_22_17_40_01 will vary based on the current datetime.

    use WatheqAlshowaiter\BackupTables\BackupTables; // import the facade
    
    class ChangeSomeData
    {
        public function handle()
        {
            BackupTables::generateBackup('users'); // will result: users_backup_2024_08_22_17_40_01
           
            // change some data.. 
        }
    }

    And More Customizations

    • You can use an array to backup more than one table
    BackupTables::generateBackup(['users', 'posts']); 
    // users_backup_2024_08_22_17_40_01
    // posts_backup_2024_08_22_17_40_01 
    • Or pass classes as parameters; it will back up their tables
    BackupTables::generateBackup(User::class); // users_backup_2024_08_22_17_40_01
    // or
    BackupTables::generateBackup([User::class, Post::class]); // users_backup_2024_08_22_17_40_01, posts_backup_2024_08_22_17_40_01 
     
    • You can customize the datetime format to whatever you want
    BackupTables::generateBackup('users', 'Y_d_m_H_i'); // users_backup_2024_22_08_17_40

    *Note: be aware that if you customize the datetime to a wider format, the package checks for an existing backup with
    the exact same datetime and will skip it, so most of the time the default will be fine.
    For example: if you use Y_d_m_H you cannot generate the same backup twice in the same hour.

    BackupTables::generateBackup('users', 'Y_d_m_H'); // can not generate the same backup in the same hour
    BackupTables::generateBackup('users', 'Y_d_m'); // can not generate the same backup in the same day

    Sometimes you want to back up some database tables before changing data for whatever reason; this package serves that
    need. I used it personally before adding foreign keys to tables, which required removing unlinked fields from parent tables.
    You may find situations where you play with table data or are afraid of losing data, so you back up these tables
    beforehand.

    ✅ Supports Laravel versions: 11, 10, 9, 8, 7, and 6.

    ✅ Supports PHP versions: 8.2, 8.1, 8.0, and 7.4.

    ✅ Supports SQL databases: SQLite, MySQL/MariaDB, PostgreSQL, and SQL Server.

    ✅ Fully automated tested with PHPUnit.

    ✅ Full GitHub Action CI pipeline to format code and test against all Laravel and PHP versions.

    Please see CHANGELOG for more information on what has changed recently.

    If you have any ideas or suggestions to improve it or fix bugs, your contribution is welcome. I encourage you to look at todos which are the most important features that need to be added. If you have something different, submit an issue first to discuss or report a bug, then do a pull request.

    If you find any security vulnerabilities, don’t hesitate to contact me at watheqalshowaiter[at]gmail[dot]com to fix them.

    And a special thanks to The King Creative for the logo ✨

    The MIT License (MIT). Please see License File for more information.

    Laravel News Links

    Pinkary is now fully open source


    Pest creator and Laravel core team member Nuno Maduro announced recently that Pinkary is now fully open source, and you can also find us on it at @laravelnews.

    Built with Laravel, Livewire, Tailwind, and more, Pinkary is an excellent example of a full Laravel application you can learn from and contribute to:

    Pinkary is already a thriving project, with over 400 submitted pull requests, and it encourages open-source contributions, giving you the installation instructions you need to go from project setup to pull request. Pinkary is already using Pest 3, and since it comes from the creator of Pest, the code has plenty of examples you can learn from to level up your testing skills.

    On the user side of this project, Pinkary is a landing page for all your links and a place where you can connect with like-minded people without the noise of other social media applications. It went from a new project to over a thousand users very quickly and is known for using a SQLite database for application data, sessions, queues, cache, etc.

    Start learning from the Pinkary source code today! Visit pinkary-project/pinkary.com on GitHub and join this exciting open-source project.


    The post Pinkary is now fully open source appeared first on Laravel News.

    Join the Laravel Newsletter to get all the latest
    Laravel articles like this directly in your inbox.

    Laravel News

    Mastering MySQL: Key Performance Metrics Every Developer Should Monitor


    The RED method is traditionally used for monitoring the performance of web applications and services but can also be applied to MySQL performance monitoring. Releem has found the framework to be equally valuable in monitoring MySQL performance metrics because the challenges faced by databases, in terms of performance and reliability, mirror those encountered by web applications.

    When applied to MySQL databases, the RED method breaks down into three critical areas of concern, each providing insights into your database’s operational health:

    • Query Rate (Rate) – This assesses the volume of queries or commands executed per second, offering a direct measure of the server’s workload. It’s instrumental in evaluating the database’s ability to handle concurrent operations and its responsiveness to user demands.
    • Error Rate (Errors) – Tracking the frequency of errors in queries sheds light on potential reliability issues within the database. A high error rate may indicate underlying problems with query syntax, database schema, or system constraints that are affecting the overall database integrity. The primary MySQL metric for monitoring errors is Aborted_clients.
    • Query Execution Duration (Duration) – The duration metric measures the time it takes for queries to complete, from initiation to execution. This performance indicator assesses the efficiency of data retrieval and processing operations, which directly impacts the user experience and system throughput.

    The health of these metrics gives you a solid understanding of how your database is performing and in turn, the experience your users are having. The RED method makes it easy to gauge what is wrong with your database and what needs to be fixed. For instance, should you find that queries are sluggishly executing, it might signal the need for tweaking indexes or optimizing the affected queries to boost efficiency.
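    As a rough illustration, the raw counters behind these rates can be sampled from MySQL's status variables; in a Laravel app that could look like this sketch (Questions and Aborted_clients are real MySQL status counters, but the sampling approach here is an assumption, not Releem's method):

    ```php
    use Illuminate\Support\Facades\DB;

    // Sample cumulative counters twice to turn them into per-second rates.
    function sampleCounters(): array
    {
        $rows = DB::select("SHOW GLOBAL STATUS WHERE Variable_name IN ('Questions', 'Aborted_clients')");

        return collect($rows)->pluck('Value', 'Variable_name')->all();
    }

    $before = sampleCounters();
    sleep(10);
    $after = sampleCounters();

    $queryRate = ($after['Questions'] - $before['Questions']) / 10;             // Rate
    $abortRate = ($after['Aborted_clients'] - $before['Aborted_clients']) / 10; // Errors
    ```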

    Laravel News Links

    How I Use a Free App to Build My Personal News Feed


    Key Takeaways

    • Inoreader is a free RSS feed reader app that consolidates articles from multiple sources for easy access.
    • Customize your feed with up to 150 sources and monitor it through a customizable dashboard.
    • Pro users gain access to advanced features such as rules, filters, article translation, and monitoring feeds, which enhance control.

    If you want to read up on your favorite online news site or any other source of information, you can either take the time to look up what you want on a specific website or go through the newsletters that clog your inbox. A more efficient way is to use an RSS (Really Simple Syndication) feed reader.

    RSS readers collect articles from multiple sources into one app, making it extremely easy to find what you need without fighting search algorithms and the general mess that the internet is. And the best part? You don’t need to pay for this feature—Inoreader lets you create a personalized news feed for free.

    What Is Inoreader?

    Inoreader is one of the best and most well-reviewed RSS feed-reading apps you can find online. If you don’t know what RSS is and what you can do with it, we’ve got you covered, but in a nutshell, RSS is a simple format that allows websites to share updates directly with you. RSS is a great alternative to newsletters and social media, as it puts you in control of what appears in your feed, free from the influence of algorithms.

    There are a lot of ways you can find popular content on the internet, but RSS readers like Inoreader allow you to build your own feeds. It’s a little bit of work setting it up, but once you’re done, you’ll have all your news sources coming up in one feed and in one app, exactly how you prefer.

    Inoreader’s free version is rather limited in terms of the features you get, but you can still get your news feed and a personalized dashboard up and running in no time. However, more advanced features like filters, automation, and translation are reserved for paying users.

    1. Setting Up a Feed

    The first thing you’ll see when you log into Inoreader is your dashboard with three options to get you started. You can search for your preferred sources like news websites, explore Inoreader’s featured feed collections, or import feeds from another RSS reader.

    Inoreader’s featured collections are a good starting point for beginners, but to create a truly personalized feed, it’s best to search for the websites and sources you frequently read. This approach helps you avoid clutter and focus on what matters most to you.

    Inoreader offers extensive control over what you can add to your feed. You can follow websites, Facebook pages, Twitter accounts, Google News, Reddit, Telegram channels, and even other Inoreader users. You can also track keywords, brands, names, subreddits, or specific phrases.

    While Facebook Pages are restricted to Pro users, all other sources are accessible on the free tier.

    The free version of Inoreader lets you select up to 150 feeds (or sources), so you can look up all your sources and add them to Inoreader. As you add sources, Inoreader will suggest similar ones to expand your feed further.

    2. Monitoring Your Feed

    Once you’ve added your desired feeds, you’ll likely return to the app frequently to stay updated. This is where Inoreader’s dashboard comes in handy.

    When you first load the reader, it defaults to a basic yet functional dashboard. However, if you want more control, you can create a totally custom dashboard. Just click the Create custom dashboard button in the top bar, select the widgets you want, and you’re off to the races.

    These dashboard widgets show everything from the latest articles coming in across your feeds to the total number of unread articles you have, article reading statistics, trending articles on Inoreader, and even recommended sources.

    There’s not a lot you can customize here, though. It would’ve been nice if Inoreader had let me resize the widgets so I could truly make the dashboard look how I wanted it, but the fact that I can deck it out with the information I want to see is valuable.

    Once you’ve set up everything to your liking, chances are you’ll find what you need on the dashboard itself, without having to scroll through your sources. And if you do need to take a deeper look, you can create folders to group similar feeds together for another layer of organization.

    Another powerful feature is Monitoring Feeds. This feature allows you to search for a specific term and create a feed for it. As articles containing the search terms appear, they’ll populate this monitoring feed, ensuring you catch what you’re looking for as soon as it arrives. Unfortunately, this feature is behind a paywall and is only accessible to Pro subscribers.

    Another feature you can use to spot words and phrases in articles quickly is Highlighters. Add your term, select a color, and you’ll see the word or phrase highlighted in that color as you read articles in your feed. This is by far one of the best features I’ve come across in an RSS feed app, and it makes it incredibly easy to find what you’re looking for without reading entire articles.

    3. Using Filters and Automation

    Two of the coolest Inoreader features, Rules and Filters, are only available to Pro users, which is a bit of a bummer. As the name suggests, rules allow you to act based on article properties. For example, you can send push notifications if an article from a specific site, author, or keyword gets published.

    Filters enable you to exclude articles from your feed or connected apps based on keywords in their title or body. Since filtered feeds only cover the last 30 days, filters help ensure you see only the content that interests you, while irrelevant content is filtered out.

    4. Translating Articles

    Inoreader also offers translation features for those reading articles in different languages. This feature, available only to Pro subscribers, eliminates the need to rely on browser-based or third-party translation tools, saving you time and effort when accessing foreign-language content.

    While Inoreader’s Pro version offers many additional features, the free version is sufficient for building a simple, clutter-free news feed that delivers what you want, when you want it.

    MakeUseOf

    How to Separate First and Last Names to Columns in Excel


    Separating first and last names in an Excel spreadsheet is a common task that can be time-consuming if done manually. Thankfully, Excel offers several efficient methods to automate this process, saving you valuable time and effort.

    Use Delimiters in Excel

    One of the most straightforward ways to separate first and last names is by using delimiters in Excel. A delimiter is a character that separates different parts of text data. In the case of names, the space between the first and last names often serves as the delimiter.

    To separate first and last names using delimiters:

    1. Select the column in your Excel spreadsheet containing the full names you want to split.
    2. Go to the Data tab and click Text to Columns in the Data Tools group.
    3. Select Delimited and click Next.
    4. Tick the Space checkbox. If a different delimiter, such as a comma or hyphen, separates the first and last names, select the appropriate option. Then, click Next.
    5. Excel will overwrite the original data in the same column by default. To keep the original data intact, specify a different column in the Destination field.
    6. Click Finish to confirm.

    Excel will split the first and last names into two columns. You can also use this to separate first, middle, and last names.
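The steps above can be sketched in plain Python (not Excel itself) to show what Text to Columns does with a space delimiter: each full name is split at spaces, and each resulting piece lands in its own "column". The names below are hypothetical sample data.

```python
# Mimic Text to Columns with a space delimiter: split each full name
# at every space so each word becomes its own "column".
full_names = ["John Doe", "Jane Q. Public"]  # hypothetical sample data

split_rows = [name.split(" ") for name in full_names]
print(split_rows)  # [['John', 'Doe'], ['Jane', 'Q.', 'Public']]
```

A two-word name yields two columns, while "Jane Q. Public" yields three, which matches how Excel splits first, middle, and last names.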

    Use the TEXTSPLIT Formula

    Another easy way to separate first and last names is using the TEXTSPLIT Excel function. This formula allows you to split text into multiple columns or rows based on a specified delimiter. Here’s how to use it.

    1. In the column that contains the full names you want to split, note down the cell address of the full name. Let’s say it’s in cell A3.
    2. Go to the cell where you want the first name to appear.
    3. Type =TEXTSPLIT(A3, " ") and press Enter.
    4. This formula will split the text in cell A3 wherever there is a space, placing the first name in the selected cell and the last name in the adjacent cell.
    5. To apply the formula to the entire column, drag the fill handle (the small square at the bottom-right corner of the selected cell) down to cover all the needed rows.
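One caveat worth knowing: =TEXTSPLIT(A3, " ") splits at every space, so a name like "Anne van Dyke" spills into three cells. A hedged Python sketch (hypothetical sample names, not Excel itself) shows the alternative of splitting only at the first space, which keeps multi-word surnames together.

```python
def first_last(full_name):
    # str.partition splits at the first space only, so everything
    # after it stays together as the "last name".
    first, _, last = full_name.partition(" ")
    return first, last

print(first_last("John Doe"))       # ('John', 'Doe')
print(first_last("Anne van Dyke"))  # ('Anne', 'van Dyke')
```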

    Use a Keyboard Shortcut

    If you’re not keen on using Excel functions or formulas to separate first and last names, there’s a quicker way to get the job done. Excel’s Flash Fill feature, triggered with the Ctrl + E keyboard shortcut, can also help you separate first and last names into columns.

    Flash Fill is a powerful tool that automatically fills in data when it detects a pattern in your input, making it perfect for separating names. Here’s how to use it:

    1. Ensure that your data is in a single column with full names.
    2. In the cell where you want to extract the first name, manually type the first name from the first full name in column A. If A3 contains John Doe, type John in B3.
    3. Select the cell where you entered the first name and press Ctrl + E.
    4. Excel will automatically detect the pattern and fill down the first names for the entire column.
    5. Similarly, in the next column, type the last name corresponding to the first full name in column A. If A3 is John Doe, you would type Doe in C3.
    6. With C3 selected, press Ctrl + E again.

    Excel will automatically populate the last names for all the rows based on the detected pattern. Knowing how to separate first and last names in Excel can help you better organize your spreadsheet. With methods such as Text to Columns, Flash Fill, and Excel formulas, you can select the approach that best suits your needs.
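The idea behind Flash Fill can be illustrated with a toy Python sketch: infer a rule from one worked example (full name to first name) and apply it down the column. Excel's real pattern inference is far more general; this sketch only checks two simple candidate rules, and all names are hypothetical samples.

```python
names = ["John Doe", "Grace Hopper", "Alan Turing"]
example_input, example_output = "John Doe", "John"

# Infer a rule from the single worked example.
if example_output == example_input.split(" ")[0]:
    rule = lambda s: s.split(" ")[0]   # word before the first space
elif example_output == example_input.split(" ")[-1]:
    rule = lambda s: s.split(" ")[-1]  # word after the last space
else:
    raise ValueError("no simple pattern detected")

print([rule(n) for n in names])  # ['John', 'Grace', 'Alan']
```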

    MakeUseOf

    We can now watch Grace Hopper’s famed 1982 lecture on YouTube

    https://cdn.arstechnica.net/wp-content/uploads/2024/08/hopper1-760×380.jpg

    Rear Admiral Grace Hopper on Future Possibilities: Data, Hardware, Software, and People (Part One, 1982).

    The late Rear Admiral Grace Hopper was a gifted mathematician and undisputed pioneer in computer programming, honored posthumously in 2016 with the Presidential Medal of Freedom. She was also very much in demand as a speaker in her later career. Hopper’s famous 1982 lecture on "Future Possibilities: Data, Hardware, Software, and People" has long been publicly unavailable because of the obsolete media on which it was recorded. The National Archives and Records Administration (NARA) finally managed to retrieve the footage for the National Security Agency (NSA), which posted the lecture in two parts on YouTube (Part One embedded above, Part Two embedded below).

    Hopper earned undergraduate degrees in math and physics from Vassar College and a PhD in math from Yale in 1934. She returned to Vassar as a professor, but when World War II broke out, she sought to enlist in the US Naval Reserve. She was initially denied on the basis of her age (34) and low weight-to-height ratio, and also because her expertise made her particularly valuable to the war effort. Hopper got an exemption, and after graduating first in her class, she joined the Bureau of Ships Computation Project at Harvard University, where she served on the Mark I computer programming staff under Howard H. Aiken.

    She stayed with the lab until 1949 and was next hired as a senior mathematician by Eckert-Mauchly Computer Corporation to develop the Universal Automatic Computer, or UNIVAC, the first commercially produced electronic computer in the United States. Hopper championed the development of a new programming language based on English words. "It’s much easier for most people to write an English statement than it is to use symbols," she reasoned. "So I decided data processors ought to be able to write their programs in English and the computers would translate them into machine code."

    Her superiors were skeptical, but Hopper persisted, publishing papers on what became known as compilers. When Remington Rand took over the company, she created her first compiler, A-0. This early achievement would one day lead to the development of COBOL for data processing, a language still in use today.

    “Grandma COBOL”

    In November 1952, the UNIVAC was introduced to America by CBS news anchor Walter Cronkite as the presidential election results rolled in. Hopper and the rest of her team had worked tirelessly to input voting statistics from earlier elections and write the code that would allow the computer to extrapolate the election results based on previous races. National pollsters predicted Adlai Stevenson II would win, while the UNIVAC group predicted a landslide for Dwight D. Eisenhower. UNIVAC’s prediction proved to be correct: Eisenhower won over 55 percent of the popular vote with an electoral margin of 442 to 89.

    Hopper retired at age 60 from the Naval Reserve in 1966 with the rank of commander but was subsequently recalled to active duty for many more years, thanks to congressional special approval allowing her to remain beyond the mandatory retirement age. She was promoted to commodore in 1983, a rank that was renamed "rear admiral" two years later, and Rear Admiral Grace Hopper finally retired permanently in 1986. But she didn’t stop working: she became a senior consultant to Digital Equipment Corporation and "goodwill ambassador," giving public lectures at various computer-related events.

    One of Hopper’s best-known lectures was delivered to NSA employees in August 1982. According to an NSA press release, the footage had been preserved in a defunct media format—specifically, two 1-inch AMPEX tapes. The agency asked NARA to retrieve that footage and digitize it for public release, and NARA did so. The NSA described it as "one of the more unique public proactive transparency record releases… to date."

    Hopper was a very popular speaker not just because of her pioneering contributions to computing, but because she was a natural raconteur, telling entertaining and often irreverent war stories from her early days. And she spoke plainly, as evidenced in the 1982 lecture when she drew an analogy between using pairs of oxen to move large logs in the days before large tractors, and pairing computers to get more computer power rather than just getting a bigger computer—"which of course is what common sense would have told us to begin with." For those who love the history of computers and computation, the full lecture is very much worth the time.

    Grace Hopper on Future Possibilities: Data, Hardware, Software, and People (Part Two, 1982).

    Listing image by Lynn Gilbert/CC BY-SA 4.0

    Ars Technica – All content