PHP & Web Development Blogs

12939 views · 6 years ago
Five Composer Tips Every PHP Developer Should Know

Composer is the way that PHP developers manage libraries and their dependencies. Previously, developers mainly stuck to existing frameworks. If you were a Symfony developer, you used Symfony and libraries built around it. You didn’t dare cross the line to Zend Framework. These days however, developers focus less on frameworks, and more on the libraries they need to build the project they are working on. This decoupling of projects from frameworks is largely possible because of Composer and the ecosystem that has built up around it.

Like PHP, Composer is easy to get started with, but complex enough to take time and practice to master. The Composer manual does a great job of getting you up and running quickly, but some of the commands are involved enough that many developers miss out on some of their power simply because they don’t fully understand them.

I’ve picked out five commands that every user of Composer should master. In each section I give you a little insight into the command, how it is used, when to use it, and why it is important.

1: Require

Sample:

$ composer require monolog/monolog


Require is the command most developers will use most often when working with Composer. In addition to the vendor/package, you can also specify a version constraint to load, along with modifiers. For instance, if you want version 1.18.0 of monolog specifically, and never want the update command to update it, you would use this command.

$ composer require monolog/monolog:1.18.0


This command will not grab the current version of monolog (currently 1.18.2) but will instead install the specific version 1.18.0.

If you always want the most recent version of monolog greater than 1.18.0, you can use the > modifier as shown in this command.

$ composer require monolog/monolog:>1.18.0


If you want the latest patch release of your current minor version, but don’t want any minor updates that may introduce new features, you can specify that using the tilde.

$ composer require monolog/monolog:~1.18.0


The command above will install the latest patch release of monolog 1.18. Updates will never go beyond the 1.18 series.

If you want to stay current on your major version but never want to go above it you can indicate that with the caret.

$ composer require monolog/monolog:^1.18.0


The command above will install the latest version of monolog 1. Updates will continue beyond 1.18, but will never move to version 2.
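To see where these constraints end up, here is a minimal sketch of the relevant part of composer.json after running one of the commands above (the surrounding entries will of course vary by project); swapping that constraint string is essentially all the require command does to the file:

{
    "require": {
        "monolog/monolog": "^1.18.0"
    }
}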

There are other options and flags for require; you can find the complete documentation of the command here.

2: Install a package globally

The most common use of Composer is to install and manage a library within a given project. There are, however, times when you want to install a given library globally so that all of your projects can use it without you having to specifically require it in each project. Composer is up to the challenge with a modifier to the require command we discussed above: global. The most common use of this is when you are using Composer to manage packages like PHPUnit.

$ composer global require "phpunit/phpunit:^5.3.*"


The command above would install PHPUnit globally. It would also allow it to be updated throughout the 5.x series, because we specified ^5.3.* as the version constraint. You should be careful when installing packages globally. As long as you do not need different versions for different projects, you are fine. However, should you start a project and want to use PHPUnit 6.0.0 (when it releases), and PHPUnit 6 breaks backwards compatibility with the PHPUnit 5.* versions, you would have trouble. Either you would have to stay with PHPUnit 5 for your new project, or you would have to test all your projects to make sure that your unit tests still work after upgrading to PHPUnit 6.
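One thing to keep in mind with global installs: the binaries land in Composer's global bin directory, which you will likely need to add to your PATH before commands like phpunit resolve from anywhere. The exact location varies by platform and Composer version (commonly ~/.composer/vendor/bin or ~/.config/composer/vendor/bin), so treat this as a sketch:

$ export PATH="$PATH:$HOME/.composer/vendor/bin"
$ phpunit --version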

Globally installed packages are something to think through carefully. When in doubt, install the package locally.

3: Update a single library with Composer

One of the great powers of Composer is that developers can now easily keep their dependencies up-to-date. Not only that, as we discussed in tip #1, each developer can define exactly what “up-to-date” means for them. With this simple command, Composer will check all of your dependencies in a project and download/install the latest applicable versions.

$ composer update


What about those times when you know that a new version of a specific package has been released and you want it, but nothing else updated? Composer has you covered here too.

$ composer update monolog/monolog


This command will ignore everything else, and only update the monolog package and its dependencies.

It’s great that you can update everything, but there are times when you know that updating one or more of your packages is going to break things in a way that you aren’t ready to deal with. Composer allows you the freedom to cherry-pick the packages that you want to update, and leave the rest for a later time.
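If you need to hand-pick more than one package, Composer also accepts several package names in a single update command; a minimal sketch (the second package name is just an illustrative example):

$ composer update monolog/monolog symfony/console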

4: Don’t install dev dependencies

In a lot of projects I am working on, I want to make sure that the libraries I download and install are working before I start working with them. To this end, many packages will include things like unit tests and documentation. This way I can run the unit tests on my own to validate the package first. This is all fine and good, except when I don’t want them. There are times when I know the package well enough, or have used it enough, not to have to bother with any of that.

Many packages create a distribution package that does not contain tests or docs. (The League of Extraordinary Packages does this by default on all their packages.) If you specify the --prefer-dist flag, Composer will look for a distribution file and use it instead of pulling directly from GitHub. Of course, if you want to make sure you get the full source and all the artifacts, you can use the --prefer-source flag.

5: Optimize your autoload

Regardless of whether you --prefer-dist or --prefer-source, when a package is incorporated into your project with require, it is simply added to the end of your autoloader. This isn’t always the best solution. Therefore, Composer gives us the option to optimize the autoloader with the --optimize switch. Optimizing your autoloader converts your entire autoloader into classmaps. Instead of the autoloader having to use file_exists() to locate a file, Composer creates an array of file locations for each class. This can speed up your application by as much as 30%.

$ composer dump-autoload --optimize


The command above can be issued at any time to optimize your autoloader. It’s a good idea to execute this before moving your application into production.

$ composer require monolog/monolog:~1.18.0 -o


You can also use the optimize flag with the require command. Doing this every time you require a new package will keep your autoloader up-to-date. That having been said, it’s still a good idea to get in the habit of using the first command as a safety net when you roll to production, just to make sure.
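If you would rather not remember the flag at all, Composer can also be told to always build the optimized autoloader via the config section of composer.json; a minimal sketch, assuming you want it applied on every dump:

{
    "config": {
        "optimize-autoloader": true
    }
}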

BONUS: Commit your composer.lock

After you have installed your first package with Composer, you now have two files in the root of your project: composer.json and composer.lock. Of the two, composer.lock is the more important one. It contains detailed information about every package and version installed. When you issue a composer install in a directory with a composer.lock file, Composer will install the exact same packages and versions. Therefore, pulling a git repo on a production server and running composer install will replicate the exact same packages in production that were installed in development. Of course, the corollary of this is that you never want to commit your vendor/ directory. Since you can recreate it exactly, there is no need to store all of that code in your repo.

It is recommended that you also commit your composer.json. When you check out your repo into production and do an install, Composer will use the composer.lock instead of the composer.json when it is present. This means that your production environment is set up exactly like your development environment.
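Putting the pieces together, a typical production deployment that relies on the committed composer.lock might look something like this sketch (the flags are standard Composer options, but adjust the steps to your own workflow):

$ git pull origin master
$ composer install --no-dev --optimize-autoloader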
24897 views · 6 years ago
Making Charts and Graphs using Laravel

Installing composer

Composer is a package management tool for PHP. Laravel requires Composer for installation. We can download Composer from https://getcomposer.org/download/

After installation, you can test whether Composer is installed by running the command:
composer

Installing Laravel

The current stable version of Laravel is Laravel 5.6. We can install the Laravel package in three ways.

In the command prompt or terminal, by running composer global require "laravel/installer" and then laravel new <project_name>

or

We can create the project with Composer by running composer create-project --prefer-dist laravel/laravel

or

Directly clone from GitHub:
git clone https://github.com/laravel/laravel.git and after that run composer update

Laravel local development server

Run the below command in command prompt or terminal
php artisan serve


The above command will start the local development server at http://localhost:8000, or if you want to change the default port:

php artisan serve --port=<port_number>


Generating charts and graphs

We are using the ConsoleTVs Charts package for generating charts. For installation, we first move inside our project directory using the command prompt or terminal, then follow the steps below.

Step 1:

First we need to install the ConsoleTVs/Charts Composer package inside our Laravel project.
composer require consoletvs/charts


Step 2:

After successful installation of the above package, open config/app.php and add the service provider.
In config/app.php


'providers' => [
    // ...
    ConsoleTVs\Charts\ChartsServiceProvider::class,
],


After the service provider, we need to add the alias:
'aliases' => [
    // ...
    'Charts' => ConsoleTVs\Charts\Facades\Charts::class,
]



Step 3

We need to configure the database for the application. We can configure it in either the .env file or the config/database.php file.
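For example, a minimal .env database section might look like the following sketch (the database name and credentials are placeholders for your own local setup):

DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=charts_demo
DB_USERNAME=root
DB_PASSWORD=secret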


Step 4

We can migrate the default tables, such as the users table, as shown below. The migration files can be found in the database/migrations folder.
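Assuming the database connection above is in place, the default migrations can be run with:

php artisan migrate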

Step 5

We can generate dummy records in the users table for the demo. For creating dummy records, we need to run the below commands in the command prompt or terminal:
php artisan tinker
>>> factory(App\User::class, 20)->create();

The above command will create a set of 20 records.

If we need to add more records, we can run the above command again, or we can increase the count as much as we want. For example:
php artisan tinker
>>> factory(App\User::class, 2000)->create();


Step 6: Creating the controller

For creating a controller, we need to run the below command in the terminal or command prompt:
php artisan make:controller <ControllerName>


Step 7: Adding the routes

We can add the routes for navigating our application. You can find the routes file inside the routes folder. Before Laravel 5.4 there was a single routes.php file; now it is web.php. If you are using Laravel 5.2, routes.php will be inside the app/Http folder.

So inside web.php:

Route::get('create-chart/{type}','ChartController@makeChart');


Here, type is the parameter we are passing, and the route maps to the makeChart() function inside ChartController.

Step 8

Import Charts into the controller; for that, add the following in the namespace section:

use Charts;
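Since the controller code in Step 9 also references the User model and the DB facade, it will most likely need those imports as well; a minimal sketch, assuming the default App\User model and the default DB facade alias:

use Charts;
use App\User;
use DB;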


Step 9

We can put the below code into ChartController:

public function makeChart($type)
{
    switch ($type) {
        case 'bar':
            $users = User::where(DB::raw("(DATE_FORMAT(created_at,'%Y'))"), date('Y'))
                ->get();
            $chart = Charts::database($users, 'bar', 'highcharts')
                ->title("Monthly new Register Users")
                ->elementLabel("Total Users")
                ->dimensions(1000, 500)
                ->responsive(true)
                ->groupByMonth(date('Y'), true);
            break;
        case 'pie':
            $chart = Charts::create('pie', 'highcharts')
                ->title('HDTuto.com Laravel Pie Chart')
                ->labels(['Codeigniter', 'Laravel', 'PHP'])
                ->values([5, 10, 20])
                ->dimensions(1000, 500)
                ->responsive(true);
            break;
        case 'donut':
            $chart = Charts::create('donut', 'highcharts')
                ->title('HDTuto.com Laravel Donut Chart')
                ->labels(['First', 'Second', 'Third'])
                ->values([5, 10, 20])
                ->dimensions(1000, 500)
                ->responsive(true);
            break;
        case 'line':
            $chart = Charts::create('line', 'highcharts')
                ->title('HDTuto.com Laravel Line Chart')
                ->elementLabel('HDTuto.com Laravel Line Chart Label')
                ->labels(['First', 'Second', 'Third'])
                ->values([5, 10, 20])
                ->dimensions(1000, 500)
                ->responsive(true);
            break;
        case 'area':
            $chart = Charts::create('area', 'highcharts')
                ->title('HDTuto.com Laravel Area Chart')
                ->elementLabel('HDTuto.com Laravel Area Chart Label')
                ->labels(['First', 'Second', 'Third'])
                ->values([5, 10, 20])
                ->dimensions(1000, 500)
                ->responsive(true);
            break;
        case 'geo':
            $chart = Charts::create('geo', 'highcharts')
                ->title('HDTuto.com Laravel GEO Chart')
                ->elementLabel('HDTuto.com Laravel GEO Chart Label')
                ->labels(['ES', 'FR', 'RU'])
                ->colors(['#3D3D3D', '#985689'])
                ->values([5, 10, 20])
                ->dimensions(1000, 500)
                ->responsive(true);
            break;
        default:
            break;
    }
    return view('chart', compact('chart'));
}


Step 10

Create a blade file. Blade is the templating engine Laravel uses for its view files. You can add a new blade file with any name, with an extension of .blade.php.
Here we are creating chart.blade.php:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>My Charts</title>
    {!! Charts::styles() !!}
</head>
<body>

    <div class="app">
        <center>
            {!! $chart->html() !!}
        </center>
    </div>

    {!! Charts::scripts() !!}
    {!! $chart->script() !!}
</body>
</html>


Step 11

We can run our Laravel application on the local development server with the php artisan serve command and then visit:

http://localhost:8000/create-chart/bar
http://localhost:8000/create-chart/pie
http://localhost:8000/create-chart/donut
http://localhost:8000/create-chart/line
http://localhost:8000/create-chart/area
http://localhost:8000/create-chart/geo



In the above example we created bar, pie, donut, line, area and geo charts. We can also create gauge, progressbar, areaspline, scatter, percentage and other chart types using the ConsoleTVs Charts Composer package.

There are also a lot of JavaScript charting libraries available, like amCharts, Chart.js, Highcharts, Google Charts, Material, Chartist, FusionCharts, Morris, PlottableJS, etc. However, using this package we can easily create charts without having to work with those libraries directly, another advantage of building it in with Laravel.
28604 views · 5 years ago
Introduction to Gitlab CI for PHP developers
As a developer, you've probably at least heard something about CI - Continuous integration. And if you haven't - you better fix it ASAP, because that's something awesome to have on your skill list and can get extremely helpful in your everyday work. This post will focus on CI for PHP devs, and specifically, on CI implementation from Gitlab. I will suppose you know the basics of Git, PHP, PHPUnit, Docker and unix shell. Intended audience - intermediate PHP devs.
Adding something to your workflow must serve a purpose. In this case the goal is to automate routine tasks and achieve better quality control. Even a basic PHP project IMO needs the following:
* Linter checks (cannot merge changes that are invalid on the syntax level)
* Code style checks
* Unit and integration tests
All of those can, of course, just be run manually from time to time. But I prefer an automated CI approach even in my personal projects, because it leads to a higher level of discipline: you simply can't avoid following the set of rules that you've developed. It also reduces the risk of releasing a bug or regression, thus improving quality.
Gitlab is generous enough to give you its CI for free, even for your private repos. At this point it is starting to look like advertising, so here is a quick comparison table for Gitlab, Github and Bitbucket. AFAIK, Github does not have a built-in solution; instead, it is easily integrated with third parties, of which Travis CI seems to be the most popular - I will therefore mention Travis here.

Public repositories (OSS projects). All 3 providers have a free offer for the open-source community!


| Provider | Limits |
|---|---|
| Gitlab | 2,000 CI pipeline minutes per group per month, shared runners |
| Travis | Apparently unlimited |
| Bitbucket| 50 min/month, max 5 users, File storage <= 1Gb/month |

Private repositories


| Provider | Price | Limits |
|---|---|---|
| Gitlab | Free | 2,000 CI pipeline minutes per group per month, shared runners |
| Travis | $69/month | Unlimited builds, 1 job at a time |
| Bitbucket| Free | 50 min/month, max 5 users, File storage <= 1Gb/month |

Getting started

I made a small project based on the Laravel framework and called it "ci-showcase". I work in a Linux environment, and the commands I use in the examples are for the Linux shell. They should be pretty much the same on Mac and nearly the same on Windows, though.
composer create-project laravel/laravel ci-showcase

Next, I went to the Gitlab website and created a new public project: https://gitlab.com/crocodile2u/ci-showcase. I cloned the repo and copied all files and folders from the newly created project into the new git repo. In the root folder, I placed a .gitignore file:
.idea
vendor
.env

Then the .env file:
APP_ENV=development

Then I generated the application encryption key: php artisan key:generate, and then I wanted to verify that the primary setup works as expected: ./vendor/bin/phpunit, which produced the output OK (2 tests, 2 assertions). Nice, time to commit this: git commit && git push

At this point, we don't yet have any CI, let's do something about it!

Adding .gitlab-ci.yml

Everyone planning to implement CI with Gitlab is strongly encouraged to bookmark this page: https://docs.gitlab.com/ee/ci/README.html. I will simply provide a short introductory course here, plus a bit of boilerplate code to get you started more easily.
The first QA check that we're going to add is a PHP syntax check. PHP has a built-in linter, which you can invoke like this: php -l my-file.php. This is what we're going to use. Because the php -l command doesn't support multiple files as arguments, I've written a small wrapper shell script and saved it to ci/linter.sh:
#!/bin/sh
files=`sh ci/get-changed-php-files.sh | xargs`
last_status=0
status=0
# Loop through changed PHP files and run php -l on each
for f in $files ; do
    message=`php -l $f`
    last_status="$?"
    if [ "$last_status" -ne "0" ]; then
        # Anything fails -> the whole thing fails
        echo "PHP Linter is not happy about $f: $message"
        status="$last_status"
    fi
done
if [ "$status" -ne "0" ]; then
    echo "PHP syntax validation failed!"
fi
exit $status

Most of the time, you don't actually want to check each and every PHP file that you have. Instead, it's better to check only those files that have been changed. The Gitlab pipeline runs on every push to the repository, and there is a way to know which PHP files have been changed. Here's a simple script, meet ci/get-changed-php-files.sh:
#!/bin/sh
# What's happening here?
#
# 1. We get names and statuses of files that differ in current branch from their state in origin/master.
#    These come in the form "<status> <filename>", one per line
# 2. The output from git diff is filtered by unix grep utility, we only need files with names ending in .php
# 3. One more filter: filter *out* (grep -v) all lines starting with R or D.
#    D means "deleted", R means "renamed"
# 4. The filtered status-name list is passed on to awk command, which is instructed to take only the 2nd part
#    of every line, thus just the filename
git diff --name-status origin/master | grep '\.php$' | grep -v "^[RD]" | awk '{ print $2 }'

These scripts can easily be tested in your local environment (at least if you have a Linux machine, that is ;-)).
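For instance, a quick local dry run from the project root, on a branch with some PHP changes, might look like this sketch (the output will depend entirely on your own changes):

$ sh ci/get-changed-php-files.sh
$ sh ci/linter.sh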
Now, as we have our first check, we'll finally create our .gitlab-ci.yml. This is where your pipeline is declared using YAML notation:
# we're using this beautiful tool for our pipeline: https://github.com/jakzal/phpqa
image: jakzal/phpqa:alpine

# For this sample pipeline, we'll only have 1 stage, in real-world you would like to also add at least "deploy"
stages:
  - QA

linter:
  stage: QA
  # this is the main part: what is actually executed
  script:
    - sh ci/get-changed-php-files.sh | xargs sh ci/linter.sh

The first line is image: jakzal/phpqa:alpine, and it's telling Gitlab that we want to run our pipeline using the PHP-QA utility by jakzal. It is a docker image containing PHP and a huge variety of QA tools. We declare one stage - QA, and this stage for now has just a single job named linter. Every job can have its own docker image, but we don't need that for the purpose of this tutorial. Our project reaches Step 2. Once I had pushed these changes, I immediately went to the project's CI/CD page. Aaaand.... the pipeline was already running! I clicked on the linter job and saw the following happy green output:
Running with gitlab-runner 11.9.0-rc2 (227934c0) on docker-auto-scale ed2dce3a
Using Docker executor with image jakzal/phpqa:alpine ...
Pulling docker image jakzal/phpqa:alpine ...
Using docker image sha256:12bab06185e59387a4bf9f6054e0de9e0d5394ef6400718332c272be8956218f for jakzal/phpqa:alpine ...
Running on runner-ed2dce3a-project-11318734-concurrent-0 via runner-ed2dce3a-srm-1552606379-07370f92...
Initialized empty Git repository in /builds/crocodile2u/ci-showcase/.git/
Fetching changes...
Created fresh repository.
From https://gitlab.com/crocodile2u/ci-showcase
 * [new branch] master -> origin/master
 * [new branch] step-1 -> origin/step-1
 * [new branch] step-2 -> origin/step-2
Checking out 1651a4e3 as step-2...
Skipping Git submodules setup
$ sh ci/get-changed-php-files.sh | xargs sh ci/linter.sh
Job succeeded

It means that our pipeline was successfully created and run!

PHP Code Sniffer

PHP Code Sniffer is a tool for keeping all of your PHP files in one uniform code style. It has a ton of customizations and settings, but here we will only perform a simple check for compatibility with the PSR-2 standard. A good practice is to create a configuration XML file in your project. I will put it in the root folder. Code Sniffer can use a few different file names, of which I prefer phpcs.xml:
<?xml version="1.0"?>
<ruleset name="ci-showcase">
    <rule ref="PSR2"/>
    <exclude-pattern>/resources</exclude-pattern>
</ruleset>

I will also append another section to .gitlab-ci.yml:
code-style:
  stage: QA
  script:
    # Variable $files will contain the list of PHP files that have changes
    - files=`sh ci/get-changed-php-files.sh`
    # If this list is not empty, we execute the phpcs command on all of them
    - if [ ! -z "$files" ]; then echo $files | xargs phpcs; fi

Again, we check only those PHP files that differ from the master branch, and pass their names to the phpcs utility. That's it, Step 3 is finished! If you go to see the pipeline now, you will notice that the linter and code-style jobs run in parallel.

Adding PHPUnit

Unit and integration tests are essential for a successful and maintainable modern software project. In the PHP world, PHPUnit is the de facto standard for these purposes. The PHPQA docker image already has PHPUnit, but that's not enough. Our project is based on Laravel, which means it depends on a bunch of third-party libraries, Laravel itself being one of them. Those are installed into the vendor folder with Composer. You might have noticed that our .gitignore file has the vendor folder as one of its entries, which means that it is not managed by the version control system. Some prefer their dependencies to be part of their Git repository; I prefer to have only the composer.json declarations in Git. That makes the repo much, much smaller than the other way round, and also makes it easy to avoid bloating your production builds with libraries only needed for development.
Composer is also included in the PHPQA docker image, so we can enrich our .gitlab-ci.yml:
test:
  stage: QA
  cache:
    key: dependencies-including-dev
    paths:
      - vendor/
  script:
    - composer install
    - ./vendor/bin/phpunit

PHPUnit requires some configuration, but at the very beginning we used composer create-project to create our project boilerplate. The laravel/laravel package has a lot of things included in it, and phpunit.xml is one of them. All I had to do was add another line to it:

<env name="APP_KEY" value="base64:..."/>

The APP_KEY environment variable is essential for Laravel to run, so I generated a key with php artisan key:generate.
git commit && git push, and we have all three jobs on the QA stage!

Checking that our checks work

In this branch I intentionally added changes that should fail all three jobs in our pipeline; take a look at the git diff. And we get this output from the pipeline stages:

**Linter**:
$ ci/linter.sh
PHP Linter is not happy about app/User.php:
Parse error: syntax error, unexpected 'syntax' (T_STRING), expecting function (T_FUNCTION) or const (T_CONST) in app/User.php on line 11
Errors parsing app/User.php
PHP syntax validation failed!
ERROR: Job failed: exit code 255

**Code-style**:
$ if [ ! -z "$files" ]; then echo $files | xargs phpcs; fi
FILE: ...ilds/crocodile2u/ci-showcase/app/Http/Controllers/Controller.php
----------------------------------------------------------------------
FOUND 0 ERRORS AND 1 WARNING AFFECTING 1 LINE
----------------------------------------------------------------------
 13 | WARNING | Line exceeds 120 characters; contains 129 characters
----------------------------------------------------------------------
Time: 39ms; Memory: 6MB
ERROR: Job failed: exit code 123

**test**:
$ ./vendor/bin/phpunit
PHPUnit 7.5.6 by Sebastian Bergmann and contributors.
F. 2 / 2 (100%)
Time: 102 ms, Memory: 14.00 MB
There was 1 failure:
1) Tests\Unit\ExampleTest::testBasicTest
This test is now failing
Failed asserting that false is true.
/builds/crocodile2u/ci-showcase/tests/Unit/ExampleTest.php:17
FAILURES!
Tests: 2, Assertions: 2, Failures: 1.
ERROR: Job failed: exit code 1

Congratulations, our pipeline is running, and we now have much less chance of messing up the result of our work.

Conclusion

Now you know how to set up a basic QA pipeline for your PHP project. There's still a lot to learn. A pipeline is a powerful tool. For instance, it can make deployments to different environments for you. Or it can build docker images, store artifacts and more! Sounds cool? Then spend 5 minutes of your time and leave a comment; you can also tell me if there is a pipeline topic you would like to see covered in future posts.
7957 views · 5 years ago
Midwest PHP and Nomad PHP Join Forces!


Interested in sponsoring? Check out the prospectus



A little history

Several years ago I had the distinct privilege of founding Midwest PHP with Jonathan Sundquist. The goal was simple, to bring an affordable PHP conference to Minnesota and the midwest region.

Midwest PHP was created for one simple reason - there weren't a lot of alternatives, especially affordable ones. At the time, your choices were ZendCon in Silicon Valley, php[tek] in Chicago, or Northeast PHP in Boston. While Northeast PHP formed the blueprint of a community conference - it still required a flight and a costly hotel in Boston. I wanted something where local attendees, college students, and those just beginning in their PHP careers could go to learn, network, and become part of the PHP community.

Shortly after Midwest PHP was formed (originally we were using the name PHPFreeze - until Sundquist told me what a horrible idea it was), Adam Culp launched Sunshine PHP, which has become one of the top community-focused PHP conferences (but still requires that flight and hotel in Miami). Sundquist and I knew that any reasonable developer would still rather attend a conference in a blizzard than enjoy the beautiful Floridian weather (ok, that might not be it, but we still understood the need that existed).

After moving to California for my new job, Jonathan Sundquist continued to run Midwest PHP as more community conferences appeared. With his efforts, and the torch being passed to Mike Willbanks, Midwest PHP celebrated its seventh consecutive year, becoming the longest continuously running PHP conference (if you go by the date it was formed; if you go by actual conference date, Sunshine PHP beats us out by a month).


A renewed focus


Developers at Midwest PHP

Because of the incredible work Jonathan and Mike have done, Midwest PHP has stood the test of time - and the peaks and valleys that come with any conference. With the shifts in the PHP community and the sad loss of several community conferences - we realized the need for Midwest PHP is greater now than ever, and to meet that need we needed to reimagine the way the conference operated.

We also realized that the best way to make Midwest PHP accessible was to combine forces, creating a seamless partnership between Nomad PHP and Midwest PHP. Through this partnership we're not only able to stream the event to make it more accessible ($19.95/mo), but also expand the conference.

This year, taking place on April 2-4, 2020 - Midwest PHP will bring together over 800 developers both in-person and virtually! Making this year truly unique, however, and staying with our purpose of helping new developers become part of the PHP community, is a brand new, FREE, beginner track. I'm excited to say we will be giving away 200 tickets to those wishing to attend our Beginner or Learn PHP track!!!

We will also work to keep prices as low as possible as we offer our standard PHP tracks (Everyday PHP and PHP Performance & Security) starting at $250/person, and a brand new enterprise track geared toward developers facing challenges at unprecedented scale starting at $450/person.

Last but not least, it is our goal with the help of our sponsors to include the workshop day as part of your ticket price - allowing you to get one day of in-depth training, and two more full days of sessions. On top of this, we're also excited to make the Nomad PHP and Nomad JS video libraries available for Standard and Enterprise attendees, providing over 220 additional virtual sessions on demand!


For sponsors

Sponsoring a conference is hard. We understand the challenge of gauging ROI, planning travel, and coordinating outreach. With the combined forces of Midwest PHP and Nomad PHP, we're able to offer sponsors unique plans that maximize their investment - while ensuring the funds go back into the event to create an amazing experience for our attendees.

Beyond Midwest PHP's goal to be the largest PHP conference this year - the included Nomad PHP advertising will help you reach a much larger and broader audience, allowing for follow up advertisements and consistent engagement with the PHP community.


Interested in sponsoring? Check out the prospectus



Next steps

For more information, please visit the Midwest PHP website. The venue, call for papers, and additional information will all be posted there soon.
13456 views · 5 years ago
Why Cloudways is the Perfect Managed Hosting for PHP Applications

The following is a sponsored blogpost by Cloudways


Developing an application is not the only thing you should focus on. You must also strive to find the best hosting solution on which to deploy that application. The application’s speed depends on the hosting provider, which is why I always advise you to go for the best hosting solution to get the ultimate app performance.

Nowadays, it is a big challenge to choose a web host, as each one has its own pros and cons which you must know before finally settling on it for deployment. I don’t recommend shared hosting for PHP/Laravel based applications, because you always get a lot of server hassles like downtime, hacking, 500 errors, lousy support and other problems that are part and parcel of shared hosting.

For PHP applications, you must focus on more technical aspects like caching, configs, databases, etc. because these are essential performance points for any vanilla or framework-based PHP application. Additionally, if the app focuses on user engagement (for instance, ecommerce store), the hosting solution should be robust enough to handle spikes in traffic.

Here, I would like to introduce Cloudways PHP server hosting to you, which provides an easy, developer- and designer-friendly managed hosting platform. With Cloudways, you don't need to focus on PHP hosting; you can focus on building your application instead. You can easily launch cloud servers on five providers including DigitalOcean, Linode, Vultr, AWS and GCE.


Cloudways ThunderStack


Being a developer, you must be familiar with the concept of stack - an arrangement of technologies that form the underlying hosting solution.

To provide a blazing fast speed and a glitch-free performance, Cloudways has built a PHP stack, known as ThunderStack. This stack consists of technologies that offer maximum uptime and page load speed to all PHP applications. Check out the following visual representation of ThunderStack and the constituent technologies:


[Image: the ThunderStack technology stack]


As you can see, ThunderStack comprises a mix of static and dynamic caches with two web servers, Nginx and Apache. This combination ensures the ultimate experience for the users and visitors of your application.


Frameworks and CMS


The strength and popularity of PHP lie in the variety of frameworks and CMSes it offers to developers. Recognizing this diversity, Cloudways offers hassle-free installation of major PHP frameworks including Symfony, Laravel, CakePHP, Zend, and CodeIgniter. Similarly, popular CMSes such as WordPress, Bolt, Craft, October, Couch, and Coaster CMS can be installed with the 1-click option. The best part is that if you have a framework or CMS that is not on the list, you can easily install it through Composer.


1-Click PHP Server & Application Installation


Setting up a stack on an unmanaged VPS could take an entire day!

When you opt for Cloudways managed cloud hosting, the entire process of setting up the server, installation of core PHP files and then the setup of the required framework is over in a matter of minutes.

Just sign up at Cloudways, choose your desired cloud provider, and select the PHP stack application.


[Image: launching a PHP server on Cloudways]


As you can see, your LAMP stack is ready for business in minutes.

Many PHP applications fail because essential services are either turned off or not set up properly. Cloudways offers a centralized location where you can view and set the status of all essential services such as:



* Apache
* Elasticsearch
* Memcached
* MySQL
* PHP-FPM
* Nginx
* New Relic
* Redis
* Varnish




Similarly, you can manage SMTP add-ons without any fuss.


Staging Environment


With Cloudways, you can test your web applications for possible bugs and errors before taking it live.

Using the staging feature, developers can first deploy their websites on test domains, where they can analyze the application's performance and potential problems. This helps site administrators fix those issues in a timely manner and view the application's performance in real time.

A default sub domain comes pre-installed with the newly launched application, making it easy for the administrators to test the applications on those testing subdomains. Overall, it's a great feature which helps developers know about the possible errors that may arise during the live deployment.


Pre-Installed Composer & Git


PHP development requires working with external libraries and packages. Suppose you are working with Laravel and you need to install an external package. Since Composer has become the standard way of installing packages, it comes preinstalled on the Cloudways platform. Just launch the application and start using Composer in your project.

Similarly, if you are familiar with Git and maintain your project on GitHub or BitBucket, you don’t need to worry about Git installation. Git also comes pre-configured on Cloudways. You can start running commands right after application launch.


Cloudways MySQL Manager


When you work with databases in PHP, you need a database manager. On the Cloudways platform, you will get a custom-built MySQL manager, in which you can perform all the tasks of a typical DB manager.



However, if you wish to install and use another database manager like phpMyAdmin, you can install it by following this simple guide on installing phpMyAdmin.


Server & Application Level SSH


If you use Linux, you typically use SSH for accessing the server(s) and individual applications. A third-party developer requires application and server level access as per the requirements of the client. Cloudways offers SSH access to fit the requirements of the client and users.



PHP-FPM, Varnish & Cron Settings


Cloudways provides a custom UI panel to set and maintain PHP-FPM and Varnish settings. Although the default configuration is already in place, you can easily change all the settings to suit your own particular development-related requirements. In the Varnish settings, you can define URLs that you want to exclude from caching. You can also set permissions in this panel.



Cron jobs are a very commonly used component of the PHP application development process. On the Cloudways platform, you can easily set up cron jobs in just a few clicks. Just declare the PHP script URL and the time when the script should run.



Cloudways API & Personal Assistant Bot


Cloudways provides an internal API that exposes all the important aspects of server and application management. Through the Cloudways API, you can easily develop, integrate, automate, and manage your servers and web apps on the Cloudways Platform using the RESTful API. Check out some of the use cases developed using the Cloudways API. You just need your API key and email for authentication of the HTTP calls on the API Playground and in custom applications.



Cloudways employs a smart assistant named CloudwaysBot to notify all users about server and application level issues. CloudwaysBot sends notifications to pre-approved channels including email, Slack and popular task management tools such as Asana and Trello.


Run Your APIs on PHP Stack


Do you have your own API that you want to run on the PHP stack? No problem, you can do that too with Cloudways! You can also use REST API micro-frameworks like Slim, Silex, Lumen, and others. APIs are used to speed up performance and require fast servers with plenty of resources. So, if you think that your API response time is getting slower due to a large number of requests, you can easily scale your server(s) with a click to address the situation.


Team Collaboration


When you work on a large number of applications with multiple developers, you need to assign them to specific applications. Cloudways provides an awesome team collaboration feature through which you can assign developers to a specific application and give them access. You can use this tool to assign one developer to multiple applications. Through the Team feature, you can connect the team together and work on a single platform. Access can be of different types, i.e. billing, support and console, and you can grant either full access or limited access by selecting the features in the Team tab.



Final Words


Managed cloud hosting ensures that you are not bothered by any hosting or server related issues. For practical purposes, this means that developers can concentrate on writing awesome code without worrying about underlying infrastructure and hosting related issues. Do sign up and check out Cloudways for the best and the most cost-effective cloud hosting solution for your next PHP project!
