In the previous article of our series for Drupal sysadmins, Setting up Nginx on a Debian server as front-end for Apache, we explained the Nginx configs that let it serve static requests while Apache handles the dynamic content. This article offers a look at an alternative setup, where PHP-FPM takes the place of Apache. The operating principle of our web server will be as follows:
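The idea can be sketched as a minimal Nginx server block (paths, domain, and socket location here are illustrative assumptions, not the article's exact config): static files are served from disk directly, and PHP requests are handed to a PHP-FPM pool over a Unix socket.

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/drupal;
    index index.php;

    # Static files are served by Nginx directly from disk.
    location / {
        try_files $uri /index.php?$query_string;
    }

    # Dynamic requests go to the PHP-FPM pool.
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass unix:/var/run/php-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```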
This installment of the Drupal-friendly server series covers setting up MySQL to work flawlessly with Drupal. The previous article described the nuances of tuning the nginx web server; now it is time to deal with the database. Our OS of choice is Debian.
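To give a taste of the kind of tuning involved, here is an illustrative my.cnf fragment with settings commonly adjusted for Drupal workloads; the values are assumptions to be sized per server, not the article's exact recommendations.

```ini
[mysqld]
innodb_buffer_pool_size        = 512M  # keep the InnoDB working set in RAM
innodb_flush_log_at_trx_commit = 2     # trade strict durability for write speed
max_allowed_packet             = 32M   # Drupal can store large cache blobs
tmp_table_size                 = 64M   # keep temporary tables in memory
max_heap_table_size            = 64M   # must match tmp_table_size to take effect
```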
Installing MySQL 5.5 or 5.6
First off, we need to update the package list:
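On Debian this step is typically done with apt; the second line (installing the default mysql-server package) is shown here as an assumption of what follows, since the exact package name for 5.5 vs 5.6 depends on the configured repositories.

```shell
sudo apt-get update                  # refresh the package index
sudo apt-get install mysql-server    # then install the server package
```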
Hi, fellow Drupalers. Today, I’d like to share with you a project of ours that allows automating web server setup for Drupal-powered websites.
At the Drupal Admin team, we often set up and fine-tune servers for Drupal websites, so it was only natural for us to develop a routine that automates the related processes. We picked the Ansible configuration management system for initial setup and further server maintenance.
Why Ansible? This system allows:
Welcome to the next installment of the series of articles for Drupal sysadmins. Today, you are going to learn the process and nuances of setting up Nginx so it works as Apache’s front-end on a Debian server.
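As a preview, the setup boils down to a server block along these lines (the port, domain, and paths are illustrative assumptions): Nginx answers on port 80, serves static assets itself, and proxies everything else to Apache listening on a local port.

```nginx
server {
    listen 80;
    server_name example.com;
    root /var/www/drupal;

    # Static content: served by Nginx directly, with long client caching.
    location ~* \.(jpg|jpeg|png|gif|css|js|ico|txt)$ {
        expires 30d;
        try_files $uri =404;
    }

    # Everything else: handed to Apache as the back-end.
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```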
In the previous article, we covered setting up a web server on a Debian machine and installing Drupal. The solution offered there has a couple of drawbacks:
Today I published the first article from the curriculum of the Drupal Admin sysadmin training organized by our team.
What does it take to set up a web server from scratch and install Drupal on it?
Let's find out.
The web server we will be using runs Debian or Ubuntu Linux and has the standard Apache, MySQL, PHP stack on board, all with default configs. We will also cover the basic setup of Drupal and some nuances of that environment.
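On Debian/Ubuntu, installing that stack is typically a handful of apt commands; the package names below are the standard Debian ones of the PHP 5 era and may differ by release, so treat this as a sketch rather than the article's exact commands.

```shell
sudo apt-get update
sudo apt-get install apache2 mysql-server php5 php5-mysql php5-gd
sudo service apache2 restart
```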
Today’s post is a quick tip that will help you keep your robots.txt after you update the site’s Drupal core with drush.
Most likely, you know the problem too: each time you update the Drupal core (typically with drush), you lose any changes you made to robots.txt manually. A good idea is to always keep a backup of this file handy, but every now and then you only have an outdated version of robots.txt, and you discover that only when you check the updated site with webmaster tools.
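The essence of the fix is simply to copy the customized file aside before the update and restore it afterwards. A minimal sketch in a throwaway directory (the drush run itself is simulated here by overwriting the file, so the snippet is self-contained):

```shell
cd "$(mktemp -d)"
printf 'Disallow: /custom\n' > robots.txt   # our customized robots.txt
cp robots.txt robots.txt.bak                # 1. back up before the core update
printf 'stock drupal file\n' > robots.txt   # 2. simulate `drush up` replacing it
cp robots.txt.bak robots.txt                # 3. restore the customizations
grep -q 'Disallow: /custom' robots.txt && echo 'robots.txt preserved'
```

In a real update you would replace step 2 with the actual drush core update command.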
Website security is crucial, yet it sometimes does not receive the attention it deserves.
Today, I'd like to share the routines we apply when auditing the security of a Drupal-powered website. For the most part, this article is a summary of the report that Dmitry Kochetov, our Drupal security specialist, gave at DrupalCamp Krasnodar 2016.
You have most likely been there too: the website is nothing out of the ordinary, quite a small one, but the space it occupies on the hard drive far exceeds its outward size, which inflates the backup costs the customer pays. The most obvious reason for the discrepancy, we thought, is unused files, and so we started looking for them. In this article, I tell the story of our quest for unused files on a Drupal-powered website (and not just Drupal, for that matter).
Today, I'd like to share with you a Drupal module I made, the Performance Monitor. I covered it in my keynote at DrupalCamp Krasnodar 2016.
Let's start with the reasons why I coded this one in the first place.
At Initlab, we often receive requests (mostly from web developers) that boil down to a couple of typical issues: the site became slow after migration to another server or hosting, or the site is slow and it feels like the server should be optimized for Drupal.