Drupal News

This is all about Drupal

  • Drupal Planet

    Security advisories: Drupal core - Critical - Third-party libraries - SA-CORE-2021-001

    1 day 6 hours ago
    Project: Drupal core
    Date: 2021-January-20
    Security risk: Critical 18/25 AC:Complex/A:User/CI:All/II:All/E:Exploit/TD:Uncommon
    Vulnerability: Third-party libraries
    Description: 

    The Drupal project uses the pear Archive_Tar library, which has released a security update that impacts Drupal. For more information please see:

    Exploits may be possible if Drupal is configured to allow .tar, .tar.gz, .bz2, or .tlz file uploads and processes them.

    Solution: 

    Install the latest version:

    Versions of Drupal 8 prior to 8.9.x are end-of-life and do not receive security coverage.

    Disable uploads of .tar, .tar.gz, .bz2, or .tlz files to mitigate the vulnerability.

    Reported By: Fixed By: 

    OpenSense Labs: Comparing web services implementations: REST vs JSON:API vs GraphQL

    1 day 10 hours ago
    Comparing web services implementations: REST vs JSON:API vs GraphQL Gurpreet Kaur Wed, 01/20/2021 - 18:38

    People today do not like to be confined, and if you ask development teams, they would hold up flags saying the same. Since development and innovation go hand in hand, and constraint is the biggest enemy of innovation, you can’t tell me they are wrong to have that notion. 

    Talking specifically about web development, there are a lot of areas to explore and a lot of technologies to help you do that. So, why limit yourself when you don't have to? Drupal has brought forward an impressive trend that has satiated the developer’s desire for innovation, and that is the headless approach.

    Unlike before, when your entire project had to be nestled inside one CMS, Drupal now gives you the opportunity to explore new technologies to your heart’s desire. This is possible because the presentation layer and the backend content become two separate entities. Drupal acts as the content repository and a frontend technology of your liking takes care of, of course, the frontend part of website architecture.

    To provide a connection between the separated parts of the project, enter the API. An API layer is a necessity when going headless, because it transmits all the information between the frontend and the backend. 

    And the three available APIs in Drupal, REST, JSON:API and GraphQL, are the reason behind me writing this blog. Although the purpose of all three is the same, they are quite different from one another. Today, we will be highlighting what they are, their pros and cons, and all the visible distinctions between them. So, let’s begin. 

    Decoding the APIs 


    REST, JSON:API and GraphQL deliver a similar outcome when they are used for decoupling Drupal. Yes, they are different too, and we will get into the differences between them soon. Before that, it is essential to understand their history, their origins and what they were intended for, because the differences actually start there. 

    REST 

    REST was developed by Roy Fielding in the year 2000; the purpose behind its development was to provide a software architectural style for APIs. In simple terms, it provided an easy path for one computer to interact with another by utilising the HTTP protocol. The communication between the two computers is not stored on the server, meaning it is stateless; rather, the client sessions are stored on the client side. 

    There are six constraints necessary to implement REST in the complete sense. 

    • It needs a separated client and server; 
    • It needs to be able to make independent calls;
    • It needs to be able to store cacheable data;
    • It needs to have a uniform interface;
    • It is a layered system; 
    • Finally, it needs code on demand (the only optional constraint). 

    REST offers a great deal of functionality without a lot of effort. For instance, if you are working on someone else’s RESTful API, you would not need a special library or special initialisation. Yes, your developers need to design their own data model using REST, but the HTTP conventions at play make programming a breeze. 
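
    For illustration, here is a minimal sketch of consuming a RESTful API from plain PHP using only the built-in cURL extension. The URL is hypothetical, and the response shape assumes Drupal core's REST module serving a node as JSON:

    <?php

    // Request a single node as JSON (the ?_format=json query parameter is
    // how Drupal core's REST module selects the serialization format).
    $ch = curl_init('https://example.com/node/1?_format=json');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, ['Accept: application/json']);

    $response = curl_exec($ch);
    curl_close($ch);

    // Decode the JSON body into an associative array and read the title.
    $node = json_decode($response, true);
    echo $node['title'][0]['value'] ?? 'No title';

    No special client library or initialisation is involved: one URL, one HTTP verb, one resource.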

    To know how REST plays a key role in decoupling Drupal, read our blog REST APIs in Drupal.

    JSON:API 

    JSON stands for JavaScript Object Notation, a lightweight format for encoding data. JSON:API, like the name says, is a specification for building APIs using JSON. First drafted in May 2013, it was designed to eliminate the need for ad-hoc code in every application to communicate with servers, by defining a standard way to do so. 

    With JSON:API, communication between the server and the client becomes extremely convenient. It not only formats the way a request should be written, but responses also come back in a standard format. The primary aim of JSON:API is to reduce the number of requests and shrink the size of the payload, all over the HTTP protocol. 

    Broadly stated: 

    • JSON:API reduces the number of requests and the amount of data being transmitted; 
    • It requires zero configuration; 
    • It uses the same access scheme for every piece of data, making caching very effective;
    • It offers quite a few features and gives you, as the client, the opportunity to turn them on or off. 
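
    To make that defined way concrete, here is roughly what a single-resource JSON:API response document looks like; the resource type follows Drupal's node--article convention, and all values are made up for illustration:

    {
      "data": {
        "type": "node--article",
        "id": "11111111-2222-3333-4444-555555555555",
        "attributes": {
          "title": "Hello world",
          "created": "2021-01-20T12:00:00+00:00"
        },
        "relationships": {
          "uid": {
            "data": { "type": "user--user", "id": "66666666-7777-8888-9999-000000000000" }
          }
        }
      }
    }

    Because every resource uses this same envelope of type, id, attributes and relationships, a client can consume any JSON:API server with the same generic code.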

    To know how JSON:API plays a key role in decoupling Drupal, read our blog, JSON API in Drupal.

    GraphQL 

    While JSON:API can work alongside REST, GraphQL was designed as an alternative to REST that addresses some of its inconveniences. Built in 2012 by Facebook, it is a cross-platform data query and manipulation language. Server implementations are available in numerous popular languages, including Java, JavaScript, Ruby, Python and C#, amongst others. 

    The features of GraphQL are that: 

    • It allows users to request data from multiple resources in a single request.
    • It can be used to make ad-hoc queries to one endpoint and access all the needed data.
    • It gives the client the opportunity to specify the exact type of data needed from the server. 
    • All of these add to its predictable data structure, making it readable as well as efficient. 
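
    As a quick sketch against a hypothetical schema, a single GraphQL query can fetch two articles together with their authors' names, and nothing else:

    {
      articles(limit: 2) {
        title
        author {
          name
        }
      }
    }

    The response mirrors the shape of the query, which is what makes the returned data structure so predictable.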

    It was in 2015, after GraphQL was open-sourced, that it became truly popular. Now its development is governed by the GraphQL Foundation, which is hosted by the Linux Foundation. 

    To know how GraphQL plays a key role in decoupling Drupal, read our blog, GraphQL in Drupal.

    Now that we know the basics of all three APIs, let us have a look at their popularity, before beginning the comparison. 

    A glimpse at the popularity of the three APIs. Source: State of API Report 2020

    REST vs JSON:API vs GraphQL 

    Now let’s get down to the details and understand why choosing one over the other two could be in your best interest. Let’s start with the differences between REST, JSON:API and GraphQL.

    How efficient is the data retrieval?


    One of the most important aspects of an API is the way it fetches data. It could require one or more requests, which is why this aspect is also referred to as request efficiency. Getting multiple pieces of data in a single request is the ideal, so let’s see how REST, JSON:API and GraphQL fare here. 

    REST 

    The REST API is innately built around one resource per request. This works perfectly as long as you only need to retrieve a single piece of data, like an article. However, if you need more than that, the number of requests you would have to make separately would be equivalent to the amount of data you need. 

    One article = one request 
    Two articles = two requests
    Two articles and the author information stored in a different field = Two requests for the articles + a long wait for the completion of those requests + two additional requests for the author information. 

    This sums up REST’s request efficiency to a T. You need to be equipped to handle a number of requests, which can ultimately stall your user experience, making it seem to go at a snail’s pace. No sugar-coating here: there are going to be a lot of round trips. 

    And the problem with a lot of round trips is a lot of extra information you do not even need. There is a possibility that a single REST API endpoint might not have all the data an application requires, so the application will not get everything it needs in one trip, making multiple trips the only option. It's safe to say that REST over-fetches, and the verbose responses can be a problem.

    JSON:API 

    JSON:API does not suffer from the multiple request conundrum. One single request can give you everything you want, be it one article, two or ten, along with the author’s information, I kid you not. 

    This is possible because JSON:API implements a concept called ‘sparse fieldsets.’ The client lists the desired fields for each resource type, and the response contains only those fields. You can request as many or as few fields as you need, and if a request grows too long to be cacheable, you can omit some sparse fieldsets to make it cacheable again. 

    Another thing to remember is that the servers can choose sensible defaults, so your developers would need to be a little diligent to avoid over-fetching. 
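
    To illustrate, a hypothetical single request could combine a page limit, sparse fieldsets and an include, so that two articles and their authors' names arrive in one round trip (the /jsonapi prefix matches Drupal's default; the field names are illustrative):

    GET /jsonapi/node/article?page[limit]=2&fields[node--article]=title,uid&include=uid&fields[user--user]=display_name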

    GraphQL 

    Coming to GraphQL, it was designed in a similar spirit to JSON:API and is competent enough to eliminate the problem of over-fetching and avoid sending multiple requests. 

    GraphQL has its own queries, schemas and resolvers that aid developers in creating API calls with particular data requirements in mind. Moreover, by requiring every query to explicitly list the resource fields it needs, so that developers cannot skip any of them, it is able to avoid multiple round trips, making over-fetching a thing of the past. 

    The only problem here can be that the queries may become too large, and consequently, cannot be cached. 

    How is the code executed?


    Using an API involves executing code on the server. This code might do some computation, call another API or load data from a database. All three APIs execute code this way; however, how the code is organised varies a little.

    REST 

    Route handlers are used to execute a REST call. These are basically functions tied to specific URLs. 

    • First the server receives the call and retrieves the URL path and the HTTP method (for example, GET); 
    • Then the server finds the matching function by comparing the method and the path against its routes; 
    • After that the result is generated, as the server executes the matched function; 
    • In the final step, once the result is serialised by the API library, it is ready for the client to see. 
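
    A minimal sketch of that flow in plain PHP, with a hypothetical route table and handler:

    <?php

    // Map "METHOD /path" keys to handler functions (hypothetical routes).
    $routes = [
      'GET /articles' => 'listArticles',
    ];

    function listArticles(): array {
      // A real handler would load this from a database.
      return [['id' => 1, 'title' => 'Hello world']];
    }

    // 1. Retrieve the HTTP method and URL path from the incoming call.
    $key = $_SERVER['REQUEST_METHOD'] . ' ' . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if (isset($routes[$key])) {
      // 2-3. Find and execute the matching route handler.
      $result = $routes[$key]();
      // 4. Serialise the result for the client.
      header('Content-Type: application/json');
      echo json_encode($result);
    }
    else {
      http_response_code(404);
    }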

    GraphQL

    GraphQL operates in a relatively similar manner. The only difference is that it uses functions for each field within a type, like the Query type, instead of using functions for specific URLs.  

    Route handlers are replaced by resolvers in GraphQL; they are still functions, though.

    • After the call is made and the server has received a request, the GraphQL query is retrieved. 
    • The query is then examined and the resolver is called upon for every field. 
    • Finally, the result is added to the response by the GraphQL library and it is ready for the client to see. 
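
    As a sketch, resolvers can be pictured as one plain PHP callable per field within a type. The schema, field names and loader functions below are hypothetical; a real project would wire these into a GraphQL library such as webonyx/graphql-php:

    <?php

    function loadArticle($id): array {
      // A real resolver would query a database here.
      return ['id' => $id, 'title' => 'Hello world', 'author_id' => 7];
    }

    function loadUser($id): array {
      return ['id' => $id, 'name' => 'Jane Doe'];
    }

    // One resolver per field within a type.
    $resolvers = [
      'Query' => [
        // Called once for the top-level field.
        'article' => fn(array $args) => loadArticle($args['id']),
      ],
      'Article' => [
        // Called for each Article field the query actually requests.
        'title'  => fn(array $article) => $article['title'],
        'author' => fn(array $article) => loadUser($article['author_id']),
      ],
    ];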

    It should be noted that GraphQL offers much more flexibility, as multiple fields can be requested in one request and the same field can be called multiple times in one query. Resolvers also make excellent trackers, because they let you know where your performance needs fine-tuning. 

    This is simply not possible in REST and JSON:API. Do you see the difference in implementation? 

    How do the API endpoints play a role?

    Many a time, once the API is designed and the endpoints are sealed, the application still requires frontend iterations that cannot be avoided. You must know that endpoints help an application receive the required data quickly in the view, so you could even call them essential. 

    However, the endpoints can pose a bit of a problem for those iterations, especially when they need to be quick. In such an instance, changes to the API endpoints have to be made for every change in the frontend, and the backend work gets tedious for no reason at all. The data required may be heavier or lighter than what the endpoint returns, which ultimately hampers productivity. 

    So, which API offers the solution?

    It is neither REST nor JSON:API. GraphQL’s flexibility makes it easy for developers to write queries stating their specific data needs along with each iteration of the frontend, without the backend having to bear the brunt.

    Moreover, GraphQL’s queries help developers retrieve specific data elements, and provide insights into which elements are popular amongst clients and which aren’t.  

    Why doesn’t REST? 

    The answer is simple: REST serves the entire data from a single API endpoint. As a user, you won’t be able to gain insights into the use of specific data, as the whole of it is always returned. 

    How good is the API exploration?


    Understanding your API and knowing about all of its resources, quickly and with ease, will always benefit your developers. In this aspect, the three perform quite differently. 

    REST 

    REST gives a lacklustre performance in API exploration, to be honest. The interactivity is pretty substandard, as navigation links are seldom available. 

    In terms of schema, it is only programmable and validatable if you use the OpenAPI standard. Auto-generation of documentation also depends on the same. 

    JSON:API 

    JSON:API performs better than REST. The available fields and links in JSON:API’s responses help its exploration and make its interactivity quite good. You can explore it using a web browser, cURL or Postman.

    Browsing from one resource to the next, debugging, or even developing on top of an HTTP-based API can all be done through a web browser with JSON:API. 

    GraphQL 

    GraphQL is indeed the front-runner here. It has an impressive feature, known as GraphiQL, that makes its API exploration unparalleled. GraphiQL is an in-browser IDE which allows developers to iteratively create queries. 

    What is even more impressive is that queries get auto-completed based on the suggestions it provides, and you get real-time results. 

    Let’s focus on schema now 


    Schemas are equally important to the frontend and the backend of the development team. Once a schema has been defined, your team knows the data structure and can work in parallel. Creating dummy test data and testing the application becomes easy for the frontend developers. All in all, productivity and efficiency levels rise. 

    REST

    REST does come with an expected resource schema of sorts, since it builds on a set of standard HTTP conventions. Despite this, nothing about the resources is formally specified. 

    JSON:API 

    In terms of schema validation and programming, JSON:API does define a generic schema; however, a reliable field-level schema is yet to be seen. Simply put, JSON:API is basic with regards to schema. 

    GraphQL

    The fact that GraphQL runs entirely on schemas makes it a pro in this regard. Schemas here are written in the Schema Definition Language, or SDL. GraphQL uses a type system that sets out every type in an API, and all of those types are spelled out in SDL. Thus, defining the way a client should access data on the server becomes easy. 
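
    As a small illustration, here is what SDL for a hypothetical article schema could look like; the type system spells out exactly what a client may query:

    type Article {
      id: ID!
      title: String!
      author: User
    }

    type User {
      id: ID!
      name: String!
    }

    type Query {
      article(id: ID!): Article
    }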

    To conclude this point, I would want to say that when there is immense complexity in the schema and resource relationships, it can pose a disadvantage for the API.  

    How simple is it to operate? 


    Operating an API essentially involves everything, from installing and configuring it to scaling it and making it secure. REST, JSON:API and GraphQL all perform well enough to make themselves easy to operate. Let’s see how. 

    REST 

    REST is quite simple to use, a walk in the park for a pro developer, because it depends on conventional HTTP verbs and techniques. You would not need to transform the underlying resources by much, since it can be supported by almost anything. It also has a lot of developer tools available; however, these need to be customised before they can be implemented. 

    In terms of scaling, REST is extremely scalable; handling high-traffic websites is no problem at all. To take advantage of this, you can make use of a reverse proxy like Varnish or a CDN. Another plus point of REST is that it has limited points of failure, namely the server and the client. 

    JSON:API 

    JSON:API is more or less the same as REST in terms of operational simplicity, so much so that you can move from REST to JSON:API without any extensive costs. 

    • It also relies on HTTP; 
    • It is also extremely scalable; 
    • It also has numerous developer tools, but unlike REST’s, JSON:API’s tools do not need customised implementations; 
    • Lastly, JSON:API also has few failure points. 

    GraphQL 

    GraphQL is the odd one out here. It isn’t as simple to use as the other two: it necessitates a specific relational structure and specific mechanisms for interlocking. You might be thinking, how is this complex? Let me ask you to focus on the word ‘specific’. What this means is that you might need to restructure your entire API’s resource logic, and such restructuring would cost you time, money and a boatload of effort. 

    Even in terms of scalability, GraphQL does not fare very well; even the most basic requests tend to be sent as POST requests, which are not cacheable by default. For you to truly capitalise on GraphQL, your servers would need their own tooling. If I talk about the points of failure here, even those are many, including the client, the server, client-side caching, and client and build tooling. 

    What about being secure?


    The kind of security an API offers is also an important consideration when choosing one. A drastic difference can be seen between REST and GraphQL. Let’s see what it is. 

    REST 

    REST is the most secure of the three, thanks to its intrinsic security features. 

    • There are different API authentication methods, including HTTP authentication;
    • There are JSON Web Tokens for sensitive data in HTTP headers;
    • There are also standard OAuth 2.0 mechanisms for sensitive data in the JSON structure. 

    JSON:API

    JSON:API is on a similar footing to REST in terms of security, the reason being that, like REST, it exposes few resources.  

    GraphQL 

    It is not that GraphQL is not secure, it is; however, security has to be attained manually. It is not secure by default, and it is not as mature as REST in this regard. 

    When the user has to apply authentication and authorisation measures on top of data validation, the chances of unpredictable authorisation checks rise. Now, do I have to tell you that such an event is bound to jeopardise your security? 

    How is the API design pinpointed?


    If an API has to perform well for every use case, you have to make it do so, by making design choices that result from your understanding of the users’ needs. You cannot just go with the flow; evaluating and understanding how your users are going to interact with your API is key to its design. 

    REST

    For REST, this exercise of deciphering user requirements must happen before the API is implemented. 

    GraphQL 

    As for GraphQL, this analysis can be delayed a little. By profiling the queries, you can tell their complexity level and pinpoint the sluggish ones, building an understanding of how users consume the API. 

    What about their use in Drupal?  


    Drupal is an important player when it comes to building websites and managing their content. With decoupling Drupal becoming more and more popular, it has become crucial to understand how the APIs perform alongside Drupal. 

    REST 

    Talking about the installation and configuration of REST, it can be complicated at best. The fact that the REST module has to be accompanied by the REST UI module does not ease the complexity. 

    With REST, clients cannot create queries with the filters they need on their own, since the REST module does not support client-generated collection queries, a capability often referred to as decoupled filtering.  

    JSON:API 

    The JSON:API module landed in Drupal core in Drupal 8.7. JSON:API’s configuration is as easy as ABC; there is simply nothing to configure. JSON:API is a clear winner in this aspect. 

    Moving to client-generated queries, JSON:API does offer its clients this luxury. They can generate their own content queries, and they won't need a server-side configuration for the same. JSON:API also honours the access control mechanisms offered by Drupal while making it easy to alter an incoming query. This is a default feature in JSON:API.
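
    For illustration, a client-generated collection query using the JSON:API module's query string conventions could ask for published articles, newest first, with no server-side setup at all (field names are illustrative):

    GET /jsonapi/node/article?filter[status]=1&sort=-created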

    GraphQL 

    The installation of GraphQL is not as complicated as REST’s, but it isn’t as easy as JSON:API’s either, because it does mandate some level of configuration from you. 

    Similar to JSON:API, GraphQL also offers decoupled filtering with client-generated queries. A less common trend amongst GraphQL projects is preferring persisted queries over client-generated queries, entailing a return to the conventional Views-like pattern.

    In addition to these three major Drupal web services, explore other alternatives in the decoupled Drupal ecosystem worthy of a trial. Read everything about decoupled Drupal, the architecture-level differences between headless and traditional setups, different ways to decouple Drupal, and the skills required to build a Decoupled Drupal web application to know more.

    Concluding with the basics 

    To sum up, let us look at the fundamental nature of the three APIs, which entails two aspects: simplicity and functionality. 

    In terms of simplicity, REST is the winner, with second place going to JSON:API, while GraphQL would not and could not be described as simple; immensely complex, with major implementation work coming your way, would be a more accurate description. In terms of functionality, GraphQL offers the most. If you choose JSON:API over GraphQL, you end up parting with some features that GraphQL has and JSON:API lacks. 

    All three are powerful and efficient in what they can do for your application. The question is how much complexity are you willing to take up with that power?


    Amazee Labs: Contributing 12 Patches in 12 Months… again!

    1 day 15 hours ago
    In 2019, I set myself a goal of contributing one patch each month to Drupal and that's how #12months12patches came to be a Slack channel at Amazee Labs. The results of that challenge are reflected in this blog post from last year.

    BADCamp News: Drupal Global Contrib Weekend at San Francisco Drupal User Group - Introduction to issue forks and merge requests

    1 day 22 hours ago
    Next week kicks off Drupal Global Contribution Weekend, January 29-31, a virtual worldwide event everyone can participate in from anywhere in the world. Want to give back to the Drupal Community in the form of code, but you're not acquainted with the new contrib process? Here’s your chance to get ready for the weekend. Now that our meetups are online, join the San Francisco community in learning how to create issue forks and merge requests.

    Dcycle: Adding continuous integration (CI) to your workflow

    1 day 23 hours ago

    This post is aimed at web development teams and is not tied to a specific technology. We will aim to not get more technical than is needed, but rather to explore what Continuous integration (CI) is, and how it can help save teams money within a month of it being set up.

    What is continuous integration?

    Although several definitions of CI have been proposed, we will use the following definition in the context of this post:

    Continuous integration (CI) is the practice of running any number of tests, automatically, on a project, periodically and/or whenever the code changes. For CI practitioners, the number one priority is for tests to always be passing.

    A simple example, please

    Here is the very simplest example I can think of:

    Let’s say you’re maintaining an old-school HTML website (no fancy stuff like databases or PHP); your team may decide to set up CI to make sure a file called “index.html” exists in your codebase: if it exists, your test passes; if it is absent, your test fails.

    Checks may be run every time your code is changed.

    Your team might store code on GitHub, and link a cloud CI provider such as CircleCI to your codebase, having it trigger every time your code changes.

    You will then define a script which is your definition of “what it means for your codebase to pass”: checking for the existence of “index.html” is a one-line script.
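
    As a sketch of what that could look like with CircleCI, a minimal .circleci/config.yml might contain something like the following; the Docker image and job name are illustrative:

    version: 2.1
    jobs:
      build:
        docker:
          - image: cimg/base:stable
        steps:
          - checkout
          # The entire test suite: fail the build if index.html is missing.
          - run: test -f index.html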

    A more complex example

    Although the example above has value, it is very simple, and you may soon find yourself wanting to add higher-value tests to your script. This ability to add complexity over time is a powerful feature of CI: getting started is simple, and you can add as many tests as you want over time depending on your available resources.

    Let’s say your team is maintaining a Drupal or Wordpress codebase with lots of complex code; your team may set up a CI server that:

    • checks for broken links on the live environment every so often;
    • checks every few minutes that the live environment is responding and has some expected keywords on its front page;
    • every so often, checks that the live environment adheres to certain Accessibility standards;
    • every so often, checks that the live environment is not reporting any errors;
    • on every code change, performs some static analysis on custom PHP code: for example, checking that a function which expects an array as an argument is never called with a string;
    • on every code change, makes sure PHP code adheres to coding standards (for example, functions should have comments, and indenting should be correct);
    • on every code change, creates a dummy Drupal or Wordpress site with a dummy database, makes sure your site fires up, and runs some end-to-end tests against it;
    • etc., etc.

    A cloud-based tool such as CircleCI can work well to check the codebase when it is changed; and a hosted tool such as Jenkins might be a good fit for running periodic checks (such as a sanity check making sure the production environment works).

    The above examples correspond to real-world checks I perform on lots of projects I maintain; and both CircleCI and Jenkins are tools I have been using for years.

    So how much does all this cost?

    “How much does this cost?” is actually the wrong question; “How much can I save?” is a better way of putting it. Consider the following graph: the horizontal axis is time, and the vertical axis is cumulative project cost.

    [Graph: cumulative project cost over time, comparing the business-as-usual approach (red) with the CI approach (blue).]
    • The red line is business as usual: because we are not maintaining CI scripts or setting up tests, the up-front cost is low. But eventually you’ll lose control of your codebase and spend all your time putting out fires (I call this the “technical debt wall”).
    • The blue line is the CI approach: a higher up-front cost to set things up, but eventually you’ll get fewer errors.
    • Where the two lines intersect is what I call the “sweet spot”. That’s when you start saving money. Your “sweet spot” is not months or years away: I firmly believe it should happen within a month. If it takes longer than a month, you’re overengineering your CI system.

    So what are these up-front costs?

    The up-front costs are:

    • Creating a simple script which defines what it means for your code “to work”. If you find this intimidating, just have your script check for a file that must be present, as in the simple example presented earlier.
    • Make sure your code is tracked in GitHub or BitBucket.
    • Make sure your entire team accepts the principle that making tests pass is the number one priority. This is crucial. If you start accepting failing tests, then CI becomes a useless burden. This also means every member of your team must agree with every test that is performed. If a test is not important enough to warrant dropping everything when it fails, then you should not have that test in your codebase.
    • Integrate a simple, free CI cloud provider like CircleCI and make sure it works.

    All of the above, together, can take between an hour and a day.

    How about the ongoing costs?

    Ongoing costs are closely related to the complexity of your CI setup. If you are just testing for an “index.html” file, your ongoing costs are close to zero, but may include:

    • dealing with errors and updates in the CI script itself. Don’t forget the CI script is computer code, and like any computer code, it needs to be maintained.
    • updating the CI script to deal with API changes in the cloud CI provider.
    • fixing false negatives. For example, someone may change the filename from index.html to index.htm, which might require you to fix your test script to also test for index.htm in addition to index.html.
    • onboarding new team members so they understand the importance of making sure tests are always passing.

    If your tests are super simple (such as checking that an “index.html” file exists), the above costs are low, probably less than one hour a month.

    If your tests are complex (as in our second example, above), you might set aside 5 to 10 hours a month for ongoing costs.

    Obviously, if your ongoing costs are higher than your savings, then you are “over-testing”.

    So what are the benefits?

    The fundamental trick of CI is to keep your benefits higher than your costs. Let’s go back to our simple “index.html” example:

    • We have already established that there are minimal up-front and ongoing costs.
    • There are also ongoing savings: once you know that your index.html file is guaranteed to exist, your manual testing time decreases.
    • The cost in lost revenue, lost confidence, and debugging time in case someone accidentally deletes index.html from your website would be considerably high.

    Based on the above, you can conclude whether it’s worth implementing CI.

    Continuous improvement of your CI setup

    Checking for “index.html” is probably of very low value, but once you’ve done that, you’ve also set up the foundation to improve your script. Every time you feel your CI script has a positive cost-benefit ratio, it is time to improve your CI script. In practice, I have found that in projects under active development, the CI setup gets constantly improved.

    Specifically, any time a problem makes its way to production, it should be a gut reaction to introduce a fix, along with a test to make sure the problem never happens again.

    The key is making incremental improvements, making sure your cost-benefit ratio is always positive.

    Docker and containerization

    Docker, and containerization generally, embed software and configuration in computer code along with your project code.

    The widespread adoption of Docker and containerization in recent years has been crucial for CI. Without containerization, if you want to run PHP static analysis, start a database with a Drupal site, and run end-to-end tests, you need to install a bunch of software on your CI server (or your laptop) and make sure the versions and configuration are in sync with your local development setups. This is simply too expensive.

    Docker makes all this easy: simply put, Docker abstracts all the software and configuration, making software act the same on any computer that has Docker installed.

    If you are not using Docker and you’d like to see how simple it makes things, install and launch Docker Desktop on your computer, give it 6 GB of RAM instead of the default 2 GB in its preferences, and then you’ll be able to run all the tests on my Drupal Starterkit project, without any additional fiddling with software configuration:

    cd ~/Desktop && git clone https://github.com/dcycle/starterkit-drupal8site.git
    cd starterkit-drupal8site
    ./scripts/ci.sh

    It should take about 10 minutes to run all the tests, and it will not add any software to your computer; everything is done on throwaway “containers”. (In general, tests become a lot more frustrating to developers as they take longer to run, which is why I have a policy of not accepting tests which take more than 20 minutes to run.)

    The amount of software packages and configuration required to run all the tests in this example is enormous: database servers and configuration, passwords, permissions, PHPUnit, the right version of PHP and Apache or Nginx… However, it’s all defined in Docker files and in code, not on host computers.

    Which is why you can run the tests in three lines of code!

    This makes it possible to run these complex tests on your computer without installing any software other than Docker.

    This also makes it possible to run these exact tests, sans extra configuration, on CircleCI or other CI providers which support virtual machines with Docker preinstalled. In fact, that’s exactly what we’re doing with the Drupal Starterkit. CircleCI even provides a cute badge to indicate whether tests are passing.

    Click on the badge below to see test results on CircleCI, which should be identical to the results on your computer if you ran the above script (you’ll need to log in with your GitHub or BitBucket account).

    Security

    Whether you are using a cloud service such as CircleCI, or hosting your own CI server with Jenkins or other software, be aware that it adds a potential attack vector for hackers, especially because by design, CI software needs access to your codebase.

    In early 2021, a vulnerability was discovered in JetBrains TeamCity (Widely Used Software Company May Be Entry Point for Huge U.S. Hacking, New York Times, January 6th, 2021) in relation to the major SolarWinds hack.

    Make sure you have a solid security policy, including the Principle of Least Privilege (POLP) and other industry-standard security approaches; also make sure your codebase, even if it’s private, does not contain any sensitive data, including API keys.

    Conclusion

    With continuous integration (CI), you can let computers do the grunt work of looking for bugs in your codebase, liberating your developers to do more productive work, reducing the number of bugs that make it into production, increasing the level of confidence of all stakeholders in your software, and letting you deploy frequently.

    And, above all, saving money.

    CI can be as simple or as complex as you need: start small, then let your CI process grow as your team becomes more comfortable with it.


    Ben's SEO Blog: 7 SEO Goals For Business: Optimization For The Right Reasons

    2 days 8 hours ago

    Selecting the right SEO goals is a critical first step in your digital marketing campaign. As with any long-term endeavor, knowing your marketing end goals and working directly toward them saves time, money, and stress.

    Great SEO cannot be done in a vacuum. It depends on business goals, the needs of the sales and customer support team, and intimate knowledge of the competitive landscape in which you work. Engage the critical stakeholders in each area of your business and work together to come up with a list of needs. You may be surprised at how SEO can help each team meet its objectives.

    It’s important to identify and select the right goals before implementing any SEO strategy and get continual buy-in from your team. Combine and categorize ideas to find the ones that... Read the full article: 7 SEO Goals For Business: Optimization For The Right Reasons

    Agaric Collective: Drupal toolbar not working in dev environments in Firefox? Here's why.

    2 days 9 hours ago

    Drupal's toolbar second level of menu options and dropdown not showing? Look for "Uncaught DOMException: The quota has been exceeded." errors, as viewable in the Firefox web console. If you see them, the problem is likely due to sites sharing a top-level domain—which is likely if you are using a local development environment like DDEV, and you're working on way too many sites at once—combined with a pretty bad Firefox bug that will be fixed in the next release.

    To quote Nathan Monfils:

    1. Everything from your public domain (abc.tld) counts against your quota, even if it is in a seemingly unrelated subdomain (e.g. my-app.example.com and intranet.example.com).
    2. The quota is not recomputed properly, requiring a firefox restart after clearing your data on other subdomains

    Note this may affect all sorts of applications, not just Drupal, when you have them running on multiple subdomains of the same top-level domain. So this isn't just about local development environments (and i dislike that DDEV shares their own top-level domain across all the instances you are working on, and while it can be changed i've accepted its way of doing things so i'm on the same page with other developers by default).

    Sure, closing more tabs and restarting Firefox could (predictably) have fixed this—and a lot else that's wrong with me, according to everyone i know—but why do that when i can open more tabs and learn precisely how broken everything around me really is?

    Read more and discuss at agaric.coop.

    Specbee: Drupal 9.1 and its compatibility with PHP 8 – Learn what’s new and how to check compatibility

    2 days 12 hours ago
    Drupal 9.1 and its compatibility with PHP 8 – Learn what’s new and how to check compatibility Pradosh 19 Jan, 2021

    “Update before you get outdated”. 

    PHP 8 is here and is now supported in Drupal 9.1 and its dependencies! November 2020 saw the big release of PHP 8. We call it a big release because of the exciting new features and optimizations it comes loaded with (which we will be talking about shortly). 

    Drupal 8.9 and 9.0 are, however, marked incompatible with PHP 8. They are still compatible with PHP 7.3 and PHP 7.4 – the latter being the last minor release of PHP 7. PHP 7.4 will stop receiving active support from the community in November 2021. And thus, updating your website to Drupal 9.1 would be a good idea now.

    Drupal 10, which is scheduled for release in June 2022, will mandate compatibility with PHP 8. Read on to find out about the amazing features PHP 8 has to offer and how you can check whether your Drupal version is compatible with PHP 8.

    What’s new with PHP 8 (Notable Changes)

    1. JIT Compiler

    JIT stands for just-in-time compilation. Starting from PHP 5.5, the Zend VM became part of PHP, and PHP 8.0 introduces JIT to address some long-standing PHP performance issues. Until now, better performance relied on OPcache: precompiled opcode handed to the PHP virtual machine as instructions, but not native machine language. JIT, on the other hand, produces actual machine code, with a mechanism to work together with OPcache. JIT does not work automatically; we need to configure it in the php.ini file.
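
    For example, a minimal php.ini snippet to switch JIT on might look like this; the buffer size is an arbitrary illustrative value:

    ; OPcache must be enabled for the JIT to work.
    opcache.enable=1
    ; Memory the JIT may use for compiled machine code.
    opcache.jit_buffer_size=100M
    ; "tracing" is the general-purpose JIT mode in PHP 8.
    opcache.jit=tracing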

        2. The null safe operator

    You must be familiar with the null coalescing operator (??) which worked as:

    $value = $var1 ?? $var2;

    It returns $var1 if it is set and not null; otherwise, it returns $var2. But it does not work with method calls. Here, the null safe operator comes into the picture.

    $value = $obj->getData()?->getValue();

    Here you can chain the getValue() call safely: if $obj->getData() returns null, the chain short-circuits and the whole expression evaluates to null instead of crashing. On the other hand, using the null coalescing operator:

    $value = $obj->getData()->getValue() ?? null; 

    ...will throw an error if getData() returns null, because the chained method call is still executed.

    3. Named arguments

    PHP 8 now allows you to pass named arguments to functions. The call does not depend on argument order; instead, you pass each argument by name.

    function named_arg_example(string $arg1, string $arg2, string $arg3) {}

    named_arg_example(
      arg1: 'arg1 value',
      arg3: 'arg3 value',
      arg2: 'arg2 value',
    );

    4. Match expression

    A match expression is like the switch statement, except that it is an expression that returns a value and does not require break statements.

    $value = match($check) {
      0 => 'Value is zero',
      1, 2, 3 => 'Value is non zero and less than 4',
    };

    There are many other great new features in PHP 8, like constructor property promotion, attributes, consistent type errors for internal functions, saner string-to-number comparisons, etc. More details can be found here.

    How to perform a Compatibility Check with PHP 8 on Drupal

    You can use this method to check whether your version of Drupal is compatible with PHP 8. For that, you will first need to make sure you have the required package – phpcompatibility – which you can download here.

    Next, you should already have Drupal installed. If not, you will need to install Drupal 9 on your system. Using Composer to install Drupal is the recommended way. For information about Composer installation, please refer to this document.

    STEP 1: Drupal Installation

    Use this Composer command to install the recommended version of Drupal:

    composer create-project drupal/recommended-project [my_site_name_dir]

    You will need to replace [my_site_name_dir] with the folder name you want to install Drupal into.

    STEP 2: Installing the required Package

    After installing Drupal, you will have a composer.json file in your Drupal root directory. Open it in a text editor and add the following code:

    "require-dev": {     "phpcompatibility/php-compatibility": "*" },

    If you already have a require-dev section in your composer.json file, just add "phpcompatibility/php-compatibility": "*" to the list.

    Next, you need to tell the PHP code sniffer where the standard is installed, by adding the following lines to composer.json:

     "scripts": {     "post-install-cmd": "\"vendor/bin/phpcs\" --config-set installed_paths vendor/phpcompatibility/php-compatibility",     "post-update-cmd" : "\"vendor/bin/phpcs\" --config-set installed_paths vendor/phpcompatibility/php-compatibility" }

    And then run:

     composer update --lock

    It will install phpcompatibility and other required packages.

    STEP 3: Compatibility Check

    Now, use this command to check PHP compatibility for the project:

    vendor/bin/phpcs -p [directorypath] --standard=PHPCompatibility --runtime-set testVersion [php version] --extensions=[file extensions] --report-full=[path/to/report-file]

    You need to replace [directorypath] with the directory the test will run on. In our case it is ‘.’, because we want to run the test in the current directory (all Drupal files and folders). You should also replace [php version] with the version you want to check compatibility with, which in this case will be 8.0. Replace [file extensions] with file extensions like php, module, inc, install, etc. The report-full option gives you the flexibility to store the report log in a file, so you will need to provide the path for the log file.

    So, for our case, the command will be: 

    vendor/bin/phpcs -p . --standard=PHPCompatibility --runtime-set testVersion 8.0 --extensions=php,module,install,inc --report-full=./drupal9-php8-compatibility.txt

    It will take a few minutes and you will get a drupal9-php8-compatibility.txt file, where you can check the reported log.

    PHP 8 is a major change, bringing some useful and attractive features and optimizations. Drupal has been adopting modern technologies and libraries since Drupal 8 and has been riding the technology wave ever since. Drupal 9.1 comes with adjustments that allow it to support Symfony 5.

    If you haven’t already migrated to Drupal 8, now is the time. Migrate to Drupal 8 and then enjoy easy updates to Drupal 9 and the subsequent versions.


    Dropsolid: JAMStack, Cloud IDE’s & Privacy - tech that will shape 2021

    2 days 13 hours ago
    19 Jan Nick Veenhof

    A new year also means some new predictions. In the web technology industry, innovation is, as always, moving faster than expected.

    In this blog I will share a couple of items to keep an eye on in the next 12 months within the web development world. As with any prediction, I can’t see into the future - so judge for yourself. :-)

    Cloud IDEs will disrupt local development flows

    One of the first major innovations that I see happening in 2021 is the adoption of Cloud IDEs. Cloud what? In short, a development environment that doesn’t need anything on your local computer to make changes to a specific project.

    Let’s take Drupal as an example. Drupal requires PHP, a database like MySQL, caching technologies like Redis or Memcached, reverse-proxy caching technologies like Varnish and, to top it off, indexing technologies like Apache Solr or Elasticsearch. Next to that, it also requires in many cases Xdebug as a means of stepping through the code. And we aren’t finished yet, as we also need an editor, also known as an IDE. Some very common IDEs are PHPStorm or Visual Studio Code.

    For someone who is just starting out, this is a very daunting task, and it most often requires help from senior profiles to get started. Translate this to classroom settings and you can see why, more often than not, this kind of complexity is not taught: it would take more time to set it up on the variety of laptops around than the actual teaching. I’ve seen this firsthand myself, when I did some guest lectures.

    Luckily there are solutions out there that try to make local development easy, such as Dropsolid Launchpad, DDEV or Lando, but all of them still require certain knowledge and, at the very least, a powerful computer that can run Docker.

    Disruption wouldn’t be called disruption if this model couldn’t see itself being threatened. Cloud IDEs are this disruption. Imagine logging in to your development platform and being able to click edit on the respective environment you want to work on. This button opens up a window with an IDE, tuned to your preferences, with all the extensions you need and with all the connections already set up. You are able to make a change, enable debugging, go through the issue step by step, make a change, commit it and deploy. All within the browser, and with compute power greater than any consumer laptop.

    Some might say this will never take off - but then again, people said the same of video game streaming services like Stadia. Stadia is outperforming many local gaming rigs today at a performance level that has never been seen. If this works for games, I’m certain it works for development environments as well.

    Only time will tell.

    Systems that can combine JAMStack ideologies but also power very dynamic content flows will gain major ground.

    The majority of the web today is powered by monolithic applications such as Adobe Experience Manager and Sitecore, but I also count WordPress and Drupal in that category. This isn’t really a problem, but there is a trend that should worry some of these systems.

    JAMStack stands for JavaScript, APIs & Markup. It’s a weird acronym, but basically it pushes for the same level of separation as the MVC (Model View Controller) paradigm, this time for the ever-increasing complexity of the frontend stack.

    The JavaScript galaxy is almost like a real galaxy: it is ever expanding, at a rate never seen before. It has caused a massive shift in what “frontend” actually means. Does being a frontend developer mean you are really good at translating designs to CSS and markup, or does it mean you are really good at creating interactive UX patterns and interactive dialogs within the scope of the browser?

    I believe there is a great opportunity for CMSs that are able to semi-decouple: either they generate the content and allow it to be styled, or they allow components to be loaded in and make use of the API endpoints. This flexibility is key, as you still want to allow content editors to make amazing landing pages in a WYSIWYG fashion, while also enabling great digital experiences using JAMstack components where it makes sense.

    Drupal is very strong in allowing any form of consumption of its content and is in the sweet spot of today's innovation. It can be used as the A in JAMstack, as it can act as a pure API, but at the same time it allows for rapid prototyping and rapid site building using known and tried methodologies. Over time I do think the APIs and tooling around CMSs will become more standard, perhaps following a path that crosses GraphQL and Schema.org, so that tools leveraging these APIs don’t have to be reinvented from scratch.

    First party data & caring about privacy will become the new normal

    For a long time, nobody cared about adding more SaaS services to a website. Services like Google Analytics, Hotjar, Intercom, etc. were added without even thinking about the concept of personal data. With the GDPR guidelines, and the other privacy regulations around the world, choosing such services now comes with additional responsibilities and liabilities for your brand and company. Not only that, but we have seen a major number of data breaches all around the world from SaaS service providers. Sometimes this data is shared with third-party vendors in order to serve advertisements, but also to combine data and form user profiles across multiple clients: the so-called “third-party cookie”.

    This third-party cookie is now seeing its demise, and it should have been abolished a long time ago. Adopting a service should not bring your users in danger, and certainly not without them knowing it. Today, Safari, Brave and Firefox all block third-party cookies already; Google Chrome will stop supporting them in 2022. I hope they reconsider that date, as it’s time to move to a more open and private web as soon as possible. Digital marketers will still be able to personalize their sites and make use of advertisements, but you will no longer put your customer at risk by sharing their data with a third party where you no longer have control over that data. Next to that, services that can guarantee that the captured data is owned by the brand will have additional benefits, and services that allow portability of these systems will win the gold medal. More often than not, this innovation is led by open source systems such as Matomo or Apache Unomi.

    The ultimate goal, though I suspect this is not for 2021, is that customers have full control over their own personal data at any given time. Tim Berners-Lee (the godfather of the world wide web) is creating technologies like Solid to enable this shift.

    With that in mind, 2021 will be a great year for services that can deliver on the promise of first-party data while still giving the digital marketeer all the tools necessary for optimizing the digital experience.

    In closing

    One thing won’t change in 2021, and that is the high bar customers have set for online brands. They will still expect a flawless journey throughout your online assets, and rightfully so. So you had better get familiar with rapid delivery (Cloud IDEs & Dropsolid Platform), great user experience (JAMstack & Drupal) and a personalized experience with your online brand (DS Personalization). Curious how Dropsolid can get you ready for 2021? We’re here to help.

    Talk to us about your next steps

    Third & Grove: Ask a Core Maintainer Anything 2020

    2 days 21 hours ago

    We set up an internal ask-me-anything session with Nat, the Drupal core maintainer who works at TAG and whom we sponsor to contribute to Drupal core every day, and let our engineering team ask him anything they wanted. We got into some very deep topics and learned some things that quite surprised us. Below is a transcript of the most interesting bits of the conversation. 

    DrupalEasy: 10 fieldable entity types every Drupal developer should know about

    3 days 6 hours ago

    If you're a Drupal developer who designs information architecture (IA) for your organization and/or clients, then hopefully by now you're thinking in terms of entities, bundles, and fields, not limiting your thinking to only content types.

    This article isn't going to dive into a lesson explaining what entities, bundles, and fields are as there is plenty of great documentation about that.

    Back in the Drupal 7 and earlier days, it was common to look at an organization's data and map it almost exclusively to only content types (maybe a few vocabularies as well). With Drupal 8's and 9's Entity API now fully mature, it's time to check yourself and make sure you take into account all of the amazing entity types that are available in both Drupal core and well-used and -maintained contributed modules. 

    With that in mind, the next time you are designing the information architecture for a Drupal site, be sure to consider the following entity types.

    1. User (core) - one of the original core entity types - still fieldable, but still not bundleable. For that, use…
2. Profile (contrib) - this useful module allows you to create various "profile types" that can be associated with each user. Examples include "author" profiles, "contributor" profiles, and "VIP" profiles.
    3. Vocabulary (core) - another original core entity type (if it ain't broke…)
    4. Content type (core) - the original and still the best, but often overused. 
5. Block type (core) - new in Drupal 8, replaces Drupal 7 modules like Bean and Boxes that provided custom, fieldable block types. Block types are great for lightweight, reusable supporting content that doesn't need a dedicated path on your site.
    6. Media (core) - starting with Drupal 8.4, media entities are now part of Drupal core. These are incredibly useful fieldable entity types if your site includes things like PDF files or videos (both locally-hosted and remote). For example, no longer do you need to create a "Document" content type to upload documents that are related to various other entities on your site. 
    7. Paragraphs (contrib) - this popular and well-maintained contributed module allows authors to mix and match various "paragraph types" (fieldable entities) in an effort to create custom layouts of (often) nodes. In this author's opinion, Paragraphs module is best used as a WYSIWYG replacement for the body field, and not as an overall page layout tool. The power of Paragraphs module lies in the fact that a site designer can create and style various paragraph types that site authors can then utilize to provide creative layouts for their content. 
8. Drupal Commerce (contrib) - another extremely well-maintained contributed module that provides several entity types related to ecommerce, including product types, orders, and more.
    9. Comment types (core) - new to Drupal 8, allows your site to have different types of comments. This can be very useful, but in our experience, not used all that often.
    10. Contact forms (core) - new to Drupal 8 and similar to the Drupal 7 Entityform module. The idea was to create a Webform-like entity type, but in our experience, Webform still continues to be a better solution in the vast majority of use cases.

While this list isn't exhaustive, we believe these are the entity types that most Drupal developers are likely to utilize.
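To see how uniform this model is in practice, below is a minimal TypeScript sketch of how these entity types surface over Drupal core's JSON:API module, which exposes each entity type and bundle pair at /jsonapi/{entity_type}/{bundle}. The site URL and the bundle machine names ("article", "document", "tags", "author") are assumptions for illustration.

```typescript
// Sketch: with Drupal core's JSON:API module enabled, every fieldable
// entity type above is readable through the same uniform API at
// /jsonapi/{entity_type}/{bundle}. URLs and bundle names are assumed.
const BASE = "https://example.com/jsonapi";

const collections = {
  articles: `${BASE}/node/article`,          // Content type
  documents: `${BASE}/media/document`,       // Media type
  tags: `${BASE}/taxonomy_term/tags`,        // Vocabulary
  authorProfiles: `${BASE}/profile/author`,  // Profile (contrib) type
};

// List the titles of all article nodes, using a JSON:API sparse fieldset
// so only the title attribute comes over the wire.
async function listArticleTitles(): Promise<string[]> {
  const res = await fetch(`${collections.articles}?fields[node--article]=title`);
  const body = await res.json();
  return body.data.map((item: { attributes: { title: string } }) => item.attributes.title);
}
```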

Drupal Career Online, our 12-week, twice-a-week online Drupal training program, teaches not only most of these entity types, but also how to figure out when to use each one. We also focus on how to work with various project stakeholders in validating the IA design early in the development process in order to keep costs and future changes to a minimum.
     

    OpenSense Labs: CMS and Static Site Generators

    3 days 6 hours ago
    CMS and Static Site Generators Gurpreet Kaur Mon, 01/18/2021 - 22:16

    Websites have entered a new playing field now, at least compared to what they used to be a few decades ago. They are not one-dimensional anymore. They represent a multitude of different business agendas that are essential for growth and visibility.

    Websites are not just limited to words, their world has widened progressively. From animations to social media integration, websites today can do it all. A major reason for these advancements in websites and their build is the software they are built on. And that is going to be the highlight of this blog.  

We will talk about Content Management Systems and Static Site Generators and shed light on their uses, their suitability and whether or not they can work in sync. So let’s begin.

Understanding a CMS

Commencing with the veterans: CMSs, or Content Management Systems, have been around for almost two decades (Drupal, one of the world leaders in web content management, was initially released on 15th January 2001). Despite that age, the conventions they are built on and the features they have gained over the years have resulted in CMSs being as modern as can be.

From easing the workload off of bloggers’ shoulders to making newspaper editors happy; from catering to corporations and their digital marketing teams to helping numerous government departments stay online and transparent, a CMS has a wide audience.

If I had to define a CMS, I would simply call it the one-stop destination for all your website’s content needs. It manages, organises and publishes web content. What is more impressive is that content authors can create, edit, contribute and publish on their own; they do not need to be dependent on developers for that. A CMS offers a collaborative environment to build and present websites, allowing multiple users to work with it at once. Terms like Web Content Management and Digital Experience Platform are being thrown around today, and they are nothing but modern variants of a CMS.

Getting into the meaning of CMS a little further, you would hear about two components, which are essentially its breakdown.

• First would be the Content Management Application. This makes marketers, merchandisers and content creators self-reliant. They can do the contextual heavy-lifting on their own with a CMA, without writing any code, so nobody from IT is needed. 
• Next is the Content Delivery Application. This is basically the foundation for your content; the back-end aspect that places your content into templates to be further presented as one website. So, what your audiences see is provided by the CDA. 

    Both of these together make a CMS whole for your use. 

    Moving further, after the meaning, it is time to get a brief understanding of the various categories of a CMS. Based upon different categorisations, there are seven in all.

    Based on the CMS’ role 

    Traditional 

    Most often, a traditional CMS is used on really simple marketing sites. I have used the term simple to describe it because it is just that, be it the layout or general functionality. You can create and edit your content using a WYSIWYG or HTML editor and it would display the content as per the CSS you have used.

    With a traditional CMS, your entire site is encompassed by one software. The frontend and the backend are closely connected through it, hence, it is also referred to as a Coupled CMS. 

    Decoupled 

Unlike its traditional counterpart, the decoupled CMS separates the frontend from the backend. This means they work independently of each other, and a change in the presentation layer does not necessarily affect the backend repository. Through decoupling, you get the features of more than one software to base your site’s architecture on.

    Headless 

A headless CMS is more or less similar to a decoupled one. When you take up a headless CMS, your content would always remain the same; however, each of your clients, be it an app, a device, or a browser, would be responsible for the presentation of the content.

The code in this instance is not in the CMS; rather, an API is used for communication and data sharing between the two layers. This way developers can consume content through an API while content authors add content at the same time. If you are looking for the ‘one size fits all’ approach, this is where you will find your answer.

    Based on cost and ownership 

    Open source 

Open source CMSs are the ones that are free of cost, at least initially. You do not need to pay a license fee for installation; however, there can be costs that you may incur for add-on templates and other such features.

    Open Source CMSs are pretty popular today, the reason being their thriving community of developers. This results in the veterans redistributing and modifying the code, which not only leads to perpetual software improvements, but also helps the newbies in making progress. 

    Proprietary 

A proprietary CMS is the exact opposite of an open source CMS, meaning it is commercial and mandates a licensing fee along with annual or monthly payments. In return for the payments, you would get an out-of-the-box system to meet all your company's requirements, continuous support and built-in functionality.

    Based on the location 

    On premises 

    As the name suggests, this is a CMS that has a physical presence within the company’s premises. The high degree of control it offers to its users is the reason for its popularity. However, the humongous investment and the chances of human error dampen its potential. 

    Cloud-based 

    The name gives it away. Such a CMS is hosted on the cloud and delivered through the web. It is essentially the combination of web hosting, web software components and technical support. It provides fast implementation and deployment along with accessibility from across the globe on any device.

    Why choose a CMS? 

Moving further, let’s now delve into the multitude of features that are packed inside a CMS, making it a suitable choice for you and your organisation’s virtual needs.

If I had to broadly categorise all the features of a CMS, I would end up with four major categories, which sum up the true potential of this software. 

    Content and its production needs

Producing content is the primary reason anyone takes on a CMS. It is true if you are a blogger, and it is also true if you work for an educational institution and its online persona. It is the content that speaks for your site, and it needs to be pristine, for lack of a better word. And CMSs help you achieve the level of control over your content production that you desire.

• Starting with the edits, the WYSIWYG editor could be deemed the heart and soul of a CMS. It gives you formatted text in paragraphs with quotes, superscripts and underlines, as well as images and videos. Your authors would not have to work with code at all. 
• Focusing on the media, images are an important part of it. Every CMS has room for them; they can be uploaded directly from your computer or archives, either within the content or on the page itself. The same is true for PDFs, animations and videos. Videos also have the option of being embedded through YouTube. 
• Furthermore, CMSs also support multilingual and multi-channel sites. This eases the pressure off of content authors and makes localised projects easy to run. 
    Content and its presentation needs

    Presentation is all about design, how it is done and how it would be showcased to the end user. There are a lot of design considerations that a CMS can help you with. 

    • A CMS would have you sorted with the right font and its size and the right colours and contrast. 
• A CMS would have you sorted with the right responsiveness for your site. 
    • A CMS would have you sorted with the right URLs and URL logic. 
    • A CMS would have you sorted with the right templating tools to change your layout. 
    • A CMS would have you sorted with the right hierarchy for your site as well as provide the right prominence to the aspects that need it. 
    • Finally, a CMS would have your site sorted for all the right accessibility protocols to make it universally accessible. 
    Content and its distribution needs

    Once the content is produced, its distribution comes into play. This has a direct impact on your site's visibility. And CMSs ensure that you get the most out of it. 

    • The foremost part of distribution needs is metadata. This helps in tagging, categorising and describing your content. It includes everything from keyword insertion to identifying the distribution channels and placing access restrictions on the content. 
• Secondly, CMSs also come equipped with automated marketing tools like analytics and A/B testing that help you understand user behaviour and capitalise on it. You would just have to define the parameters and the automation would do the rest, be it publishing on your site or email marketing. 
    Content and its management needs

Then comes the management of your content, a perpetual process that makes life easier for editors and developers and streamlines the building and updating of a website.

• For one, a CMS helps you plan and execute the publishing of your content. You can actually schedule when, what and where to post. You can also decide when something would be available for the audience to see and when it won’t be, like an event post. Once the event has happened, it won't need to be on your site anymore, and a CMS helps with that. 
• CMSs also help you figure out user roles and implement them. This helps in ensuring that sensitive information is only accessible to the users who have the clearance. A manager and a director are going to have different roles, as are a premium member and a regular member of your site. 
• Finally, a CMS helps you avoid instances where you delete something important and its recovery becomes impossible. Version control and revisions are features that have to be in your CMS if you want the power to bring back lost content. 

Apart from these main categories, CMSs are also renowned for their security, their scalability and their user-friendliness. There is one more thing to add: a CMS can go above and beyond its own capabilities by integrating with third parties and combining their features with its own; a headless CMS is an example of the same. Drupal is one of the most popular CMSs when it comes to going headless. Read our blog, Decoupled Drupal Architecture, to know more about it.

    Understanding a new vogue: Static Site Generators 

Before understanding a static site generator, let’s shed some light on static sites, since these are what it builds. A static site is one that is designed in a way that it remains static, fixed and constant, during its design, its storage on a server and even upon its delivery to the user’s web browser. This is the attribute that differentiates it from a dynamic site: it never changes; from the developer's desktop to the end user’s, it remains as-is.

    Coming to Static Site Generators or SSG, in the most basic of terms they apply data and content to templates and create a view of a webpage. This view is then shown to end users of a site. 

Now let’s get a little technical. You know that an SSG will only create static sites; it does so by creating a series of HTML pages that get deployed to an HTTP server. There are only files and folders, which means no database and no server-side rendering.

    Developers using an SSG, create a static site and deploy it to the server, so when a user requests a page, all the server has to do is find the matching file and route it towards the user. 

    If I talk about the difference between an SSG and a conventional web application stack or a CMS, I would say that it is in the view of webpages. While an SSG keeps all the views possibly needed for a site at hand well in advance, a traditional stack waits until a page has been requested and then generates the view.
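To make that difference concrete, here is a toy static site generator in TypeScript. The content/ and public/ directories and the template are invented for this sketch, but the shape is the same in any real SSG: render every view at build time, then let the web server do nothing but return files.

```typescript
// A toy static site generator: every view is rendered to an HTML file at
// build time, so the HTTP server only ever has to find a matching file.
import { mkdirSync, readdirSync, readFileSync, writeFileSync } from "node:fs";
import { basename, join } from "node:path";

// The "template": data and content in, a full HTML view out.
const template = (title: string, body: string): string =>
  `<!DOCTYPE html><html><head><title>${title}</title></head><body><h1>${title}</h1>${body}</body></html>`;

// Build step: turn every .txt file in content/ into a pre-built page.
// Each source file is assumed to hold a title line followed by paragraphs.
mkdirSync("public", { recursive: true });
for (const file of readdirSync("content").filter((f) => f.endsWith(".txt"))) {
  const [title, ...paragraphs] = readFileSync(join("content", file), "utf8").split("\n");
  const html = template(title, paragraphs.map((p) => `<p>${p}</p>`).join(""));
  writeFileSync(join("public", basename(file, ".txt") + ".html"), html);
}
```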

    Why did SSG come along?

    Static Site Generators act differently than a CMS, they are more aligned with the needs of static sites. However, their emergence has a bigger story to tell. 

Yes, CMSs are quite popular today, yet there is a drawback to that. With the rising acclaim of CMSs, some of them have become more prone to cyberattacks. The lead in security hacks goes to WordPress, which accounted for almost 90% of all hacked CMS sites as reported by ITPRO in 2020. Drupal, by contrast, is considered the most secure CMS, as can be seen in Sucuri’s 2019 Website Threat Research Report.

Then there is the issue of performance. CMS sites operate mainly upon their servers, meaning the servers do the heavy lifting. Every request means the server takes charge of assembling the page from templates and content. It also means that for every user visiting your site, the PHP code has to start up, communicate with the database, create an HTTP response based on the retrieved data, and finally return an HTML file to the user’s browser to display the content. All of this may impede the performance of a site built on a CMS when compared to one powered by a static site generator. But it’s not as if CMSs give you low-performance websites; they do have provisions for delivering high-performance websites. It depends upon which CMS you go with. If web performance is your concern, Drupal can be your go-to option.

An SSG is a solution to these two conundrums; hence, it emerged with a bang.

    What can a Static Site Generator do for you?

    Static Site Generators solve a lot of the issues that a CMS cannot, consequently they can provide you a lot for your site’s well-being. 

    SSG means better security 

With an SSG, the need for a dynamic application server is non-existent, and this is the reason it provides more security. As we have already established, an SSG site is rendered well in advance, and its ready-to-serve infrastructure helps remove avenues for malicious attacks on your site. Servers no longer need to perform any logic or work.

Apart from this, with an SSG, you would not need to access databases, execute logical operations or alter resources for each individual view. As a result, you get an easy hosting infrastructure as well as enhanced security, because far fewer systems are involved in fulfilling requests.

    SSG means elevated performance 

A website’s performance is concerned with its speed and request time, and SSGs deliver in this area as well. Whenever a page is requested, a whole set of mechanisms is involved in getting it displayed for the visitor: the distance the request has to cover, the systems it has to interact with, and the work that those systems do. All of these take up time, weighing on your performance.

Since an SSG site does not mandate such a lengthy iteration per visitor request, it reduces the travel time. This is done by delivering the work directly from a CDN, a distributed network of caches, which avoids the system interaction altogether. Resultantly, your performance soars.

    SSG means higher scalability 

    When an SSG builds a site, it is often considered pre-built. I mean that is what building all the views in advance of an actual request could be defined as, right? So, with a pre-built site, you have less work on your hands. For instance, a spike in traffic would not mandate you to add in more computing power to handle each additional request, since you have already done all the work beforehand. You would also be able to cache everything in the CDN and serve it directly to the user. As a result, SSG sites offer scalability by default. 

    When should you choose a Static site generator?

    Now that you know how an SSG can benefit you, it is time to understand the scenarios that would mandate taking up a static site generator and all its advantages. 

When building a complex site is the goal 

If you want your website to deliver more complexity, in terms of the kind of features it provides, an SSG becomes a good choice. Many come equipped with ready-to-go client-side features.

    When creating and displaying content is the only goal

Here an SSG is a suitable choice because it would generate pages and URLs for you. And these pages would give you 100% control over what is being displayed, meaning the output would always be in your hands; content pages need that.

    When generating numerous pages is the goal 

A static site generator can create pages at great speed. It might not be a matter of seconds, but it is quite fast. So, when creating websites that need a lot of pages, an SSG’s speed comes in quite handy.

    When templating needs are complex as well 

An SSG is a powerful piece of software; it has the ability to handle your site’s visual style and content along with its behaviour and functionality. This feature becomes fruitful when building a website with diverse templating needs. Vue- and React-based SSGs would definitely help you get the versatility you need on your website, along with the standard concept of code reuse.

I would like to add just one more thing, and that is that your team must be familiar with the static site generator you are going to end up using. There are a lot on the market. If your team is familiar with .NET, use an SSG powered by it. On the other hand, if it finds JavaScript more familiar territory, go with an SSG based on that. Let your development team be a part of the discussion when the suitability of a static site generator is being decided.

    Are Static Site Generators always the right option? 

    Coming from the suitability, you would think that an SSG is a great choice. Don’t get me wrong, it is. However, it isn’t a universal software. There are instances when it may not be the right choice. So, let’s delve into these scenarios.

    Not when you do not have development experience 

Static Site Generators are a tad bit difficult for amateur developers. Your developers ought to have experience to reap all the benefits. The building process is considered more difficult than that of a CMS, and even finding plugins for pre-built pages can become a chore. Furthermore, there isn’t a huge community out there to help you with the development part if you are a beginner.

    Not when you need a site built urgently 

You have to understand that urgency and SSGs are not the best of friends. From learning the build process to developing the template code, everything needs time.

There are development scripts to be made;
There is the complication of customising them;
There is the additional process of creating and setting up Markdown files.

All of these add up to more time requirements for the development process. Think of it like this: you are going to be doing all the grunt work beforehand, and that necessitates more time.

    Not when you need server-side functionality 

When partnering with an SSG, you would be parting with some, if not many, interactive functions on your site. For instance, user logins would be difficult to create, and so would web forms and discussion forums. There are certain options like lunr.js search and Disqus commenting to help you with your site's interactivity, but I would say that these options are pretty limited.

    Not when your site has to have hundreds of pages

You might think that I am contradicting myself; however, I am not. Static site generators can create a website with a thousand pages, yet the process can become tedious and awkward. For a thousand or so pages, content editing and publishing would be cumbersome. Along with this, real-time updates could get delayed and, like I mentioned before, build times rise accordingly.

    Not when website consistency is a priority 

    Lastly, SSG sites offer a lot of flexibility. That should be a good thing, however, it does have a side effect and that is on your site’s consistency. This is because anything that is found in the Markdown files can be rendered as page content. Consequently, users get the chance to include scripts, widgets and other undesired items. 

    Can a CMS and an SSG work together? 

Yes, a CMS and an SSG can work together, and pretty efficiently at that. However, that partnership is only possible with a headless CMS. This is because a headless CMS gives room for other frontend technologies to come and play, and in this case that technology is found in static site generators.

A headless CMS is pretty versatile; choosing a static site generator as its head could help you get most of the benefits that both the static site and the headless CMS bring along. This partnership indeed has a lot to offer. Let’s find out what that is.

    Proffers easy deployment via APIs

    SSGs are quite straightforward to use, especially with an API, which is the connecting force between the SSG and the CMS. Pulling data from an API for generating and deploying a static PWA to any web host or Content Delivery Network is a breeze. 
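As a rough sketch of that workflow (the CMS URL, the "article" content type, and the output layout are assumptions for illustration), a build script might pull published articles from Drupal's core JSON:API and write one static HTML file per item:

```typescript
// Sketch of the API-driven build step: pull content from a headless
// Drupal site over JSON:API and emit static HTML files for deployment
// to any web host or CDN.
import { mkdirSync, writeFileSync } from "node:fs";

interface Article {
  id: string;
  attributes: { title: string; body?: { processed: string } };
}

async function build(): Promise<void> {
  const res = await fetch("https://cms.example.com/jsonapi/node/article");
  const { data } = (await res.json()) as { data: Article[] };

  mkdirSync("public", { recursive: true });
  for (const article of data) {
    const body = article.attributes.body?.processed ?? "";
    const html = `<article><h1>${article.attributes.title}</h1>${body}</article>`;
    writeFileSync(`public/${article.id}.html`, html);
  }
}

build();
```

Rerunning a script like this whenever editors publish is the whole update story, which is exactly the workflow described under "Proffers easy updates to sites" below.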

    Proffers ease to the marketing team 

When you work only with an SSG, you would face difficulties, as it puts a lot of restrictions on the marketing team. This isn’t a problem when you partner with a CMS.

    Proffers easy editing and workflow 

Conventionally, SSGs do not have a WYSIWYG editor or workflow capabilities for the tracking and collaboration of content. You might think that these are only needed for dynamic sites, but that isn’t the case; static sites also need them. Since CMSs have those capabilities, they become ideal for preparing content before actually running the SSG; the perfect contextual partnership.

    Proffers easy updates to sites 

With a CMS, you can easily change and update the content. With an SSG, the same changes can be pulled through the APIs and a new static site can be generated every time they occur. All the developers have to do is set up a tool for content pulling and generation. As a result, your site would always be up-to-date, and no request processing is needed whenever users visit your site.

    To check out some examples of how CMS and SSG can come together, read how Drupal and Gatsby can be leveraged for developing blazing fast websites. You can also go through the benefits of going ultra-minimalistic with the combination of Metalsmith and Drupal.

    Conclusion 

In the end, all I want to say is that both a CMS and an SSG have their own sets of features and capabilities that make them excellent at what they do, making their users more than happy. However, when it comes to getting the best out of both of them, there is only one kind of CMS, the headless kind, that can help you reap the benefits of this dynamic duo. It is up to you to decide whether you want to use them together or individually. 
     


    Promet Source: What is Human-Centered Web Design?

    3 days 21 hours ago
Human-centered design is a concept that gained traction in the 1990s as an approach to developing innovative solutions based on a laser-sharp focus on human needs and human perspectives during every phase of a design or problem-solving process. Building upon the principles of human-centered design, Promet Source has served as a pioneer and leading practitioner of human-centered web design.

    Golems GABB: Revealing the secrets of decoupled Drupal Commerce

    4 days 4 hours ago
Revealing the secrets of decoupled Drupal Commerce Editor Sun, 01/17/2021 - 21:08

    There is a technology that allows developers to upscale and speed up e-commerce sites and take their customers’ shopping experiences to a whole new level. It’s called decoupled Drupal Commerce. The Drupal development world is buzzing with discussions of this hot trend. Of course, the Golems Drupal team is happy to join in. Dear readers, our tour of decoupled Drupal e-commerce begins.

    What is decoupled Drupal Commerce?

Decoupled Drupal Commerce is an architecture where your online store backend, or the data hub of your e-commerce shop, is separated from the user interface, or customer experience layer. Another frequently used term is headless Drupal Commerce.
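As a minimal illustration of the idea, assuming a Drupal Commerce backend with core's JSON:API module enabled (the shop URL and the "default" product type below are assumptions), a decoupled frontend could list products over HTTP like any other entity:

```typescript
// Sketch: in a decoupled shop, the customer experience layer fetches
// product data from the Drupal Commerce backend over an API instead of
// being rendered by the CMS itself.
async function listProducts(): Promise<void> {
  const url =
    "https://shop.example.com/jsonapi/commerce_product/default" +
    "?fields[commerce_product--default]=title"; // Sparse fieldset: titles only.
  const res = await fetch(url);
  const { data } = await res.json();
  for (const product of data) {
    console.log(product.attributes.title);
  }
}

listProducts();
```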

    Drupal In the News: On Its 20th Birthday, Drupal Poised To Capture The Next Generation Of The Digital Experience Market

    6 days 8 hours ago

    20 years ago, the Drupal project started in a dorm room—today it is a billion dollar industry.

PORTLAND, Ore., U.S.A., January 15, 2021—Drupal, the world’s leading open source digital experience platform (DXP), celebrates 20 years of community-driven innovation. Since its founding 20 years ago, Drupal has touched millions of lives. One in 30 sites on the web is powered by Drupal, and that means most users of the web have experienced Drupal—even if they don't know it. 

    Drupal has pioneered the evolution of content delivery across multiple channels. Whether powering conversational user interfaces (CUI) for smart devices, pushing content to digital signage for New York Metropolitan Transportation Authority (MTA), or serving as the core content store for augmented reality experiences, Drupal’s sophisticated architecture and platform stand ready for the future of digital content. 

    Redefining digital experiences

    7 years ago—on the eve of Drupal's birthday—Drupal founder and project lead, Dries Buytaert, laid out his belief that the web was entering a new era. 

“Mobile had transformed the web, but I believed this was just the beginning. The mobile web was the first example of a new web defined by digital experiences that conform to a user's context and devices,” says Buytaert. “Since then, Drupal has defined itself as the leading platform for ambitious digital experiences, and as channels and devices proliferate, Drupal will continue to lead the open source DXP market.”

    Powered by a global community of innovation

“As part of this 20 year milestone, we celebrate our community of more than 100,000 contributors who made Drupal what it is today,” says Heather Rocker, executive director of the Drupal Association. “Success at this scale is possible because the Drupal community exemplifies the values of open source and proves that innovation is sustained by healthy communities. One of our key goals at the Drupal Association is to convene the resources necessary for continued project innovation, and we do that through the collaboration of a global community that continues to grow year after year.”

    In fact, Drupal contribution increased by 13% in a year when many industries contracted due to the COVID-19 pandemic. Drupal's open source model has built a robust and thriving ecosystem of individual contributors, professional service providers, and end-user organizations that is well positioned to capitalize on the next 20 years of digital innovation. 

    Drupal continues to evolve, serving needs around the globe and expanding into new markets.  Future-looking priorities include a continued positive impact on the Open Web, the cultivation of a diverse and inclusive open source community, and an increased focus on editorial experience and usability—to make the power of the Drupal digital experience platform even more accessible.  

    2021 will be marked with year-long celebrations happening around the world with particular focus at DrupalCon in April. Related 20th birthday events can be found on social media through the hashtag #CelebrateDrupal and at CelebrateDrupal.org.  

    About Drupal and the Drupal Association

    Drupal is the open source digital experience platform utilized by millions of people and organizations around the world, made possible by a community of 100,000-plus contributors and enabling more than 1.3 million users on Drupal.org. The Drupal Association is the non-profit organization focused on accelerating Drupal, fostering the growth of the Drupal community, and supporting the Project’s vision to create a safe, secure, and open web for everyone.

     
    ###
     
    For more information or interview requests contact Heather Rocker,  heather@association.drupal.org
     

    Dries Buytaert: Drupal celebrates 20 years!

    6 days 9 hours ago

On January 15, 2001, exactly 20 years ago, I released Drupal 1.0.0 into the world. I was 22 years old, and had just finished college. At the time, I had no idea that Drupal would someday power 1 in 35 websites, and impact so many people globally.

    As with anything, there are things Drupal did right, and things we could have done differently. I recently spoke about this in my DrupalCon Europe 2020 keynote, but I'll summarize some thoughts here.

Why I'm still working on Drupal after 20 years

Me, twenty years ago, in the dorm room where I started Drupal. I'd work on Drupal sitting in that chair.

    I started Drupal to build something for myself. As Drupal grew, my "why", or reasons for working on Drupal, evolved. I began to care more about its impact on end users and even non-users of Drupal. Today, I care about everyone on the Open Web.

    Optimizing for impact means creating software that works for everyone. In recent years, our community has prioritized accessibility for users with disabilities, and features like lazy loading of images that help users with slower internet connections. Drupal's priority is to continue to foster diversity and inclusion within our community so all voices are represented in building an Open Web.

Three birthday wishes for Drupal

Me in 2004, giving my first ever Drupal presentation, wearing my first ever Drupal t-shirt.

    Drupal's 20th birthday got me thinking about things I'm hoping for in the future. Here are a few of those birthday wishes.

    Birthday wish 1: Never stop evolving

Only 7% of the world's population had internet access when I released Drupal 1 in 2001. Smartphones and the mobile web didn't exist. Many of the largest and most prominent internet companies were either startups (e.g. Google) or had not launched yet (e.g. Facebook, Twitter).

    A list of technology events that came after Drupal, and that directly or indirectly impacted Drupal. To stay relevant, Drupal had to adjust to many of them.

    Why has Drupal stayed relevant and thrived all these years?

First and foremost, we've been focused on a problem that existed 20 years ago, exists today, and will exist 20 years from now: people and organizations need to manage content and participate on the web. Working on a long-lasting problem certainly helps you stay relevant.

    Second, we made Drupal easy to adopt (which is inherent to Open Source), and kept up with the ebbs and flows of technology trends (e.g. the mobile web, being API-first, supporting multiple channels of interaction, etc).

    The great thing about Drupal is that we will never stop evolving and innovating.

    Birthday wish 2: Continue our growing focus on ease-of-use

    For the longest time I was focused on the technical purity of Drupal and neglected its user experience. My focus attracted more like-minded people. This resulted in Drupal's developer-heavy user experience, and poor usability for less technical people, such as content authors.

    I wish I had spent more time thinking about the less technical end user from the start. Today, we've made the transition, and are much more focused on Drupal's ease-of-use, out-of-the-box experience, and more. We will continue to focus on this.

    Birthday wish 3: Economic systems to sustain and scale Open Source

    In the early years of the Open Source movement, commercial involvement was often frowned upon, or even banned. Today it's easy to see the positive impacts of sponsored contributions on Drupal's growth: two-thirds of all contributions come from Drupal's roughly 1,200 commercial contributors.

    I believe we need to do more than just accept commercial involvement. We need to embrace it, encourage it, and promote it. As I've discussed before, we need to reward Makers to maximize contributions to Drupal. No Open Source community, Drupal included, does this really well today.

    Why is that important?

    In many ways, Open Source has won. Open Source provides better quality software, at a lower cost, without vendor lock-in. Drupal has helped Open Source win.

    That said, scaling and sustaining Open Source projects remains hard. If we want to create Open Source projects that thrive for decades to come, we need to create economic systems that support the creation, growth and sustainability of Open Source projects.

    The alternative is that we are stuck in the world we live in today, where proprietary software dominates most facets of our lives.

    In another decade, I predict Drupal's incentive models for Makers will be a world-class example of Open Source sustainability. We will help figure out how to make Open Source more sustainable, more fair, more egalitarian, and more cooperative. And in doing so, Drupal will help remove the last hurdle that prevents Open Source from taking over the world.

Thank you

A group photo taken at DrupalCon Seattle in 2019.

    Drupal wouldn't be where it is today without the Drupal community. The community and its growth continues to energize and inspire me. I'd like to thank everyone who helped improve and build Drupal over the past two decades. I continue to learn from you all. Happy 20th birthday Drupal!

    1xINTERNET blog: Celebrating twenty years of Drupal

    6 days 12 hours ago
    During the year 2021 we will be focusing on highlighting our work at 1xINTERNET and our solutions made with Drupal. We call this series “Celebrate 20 years of Drupal” where we will highlight 20 projects through the year that we are involved in and where Drupal has been used.

    Dries

    Acquia retrospective 2020

    1 week 3 days ago

    At the beginning of every year, I like to publish a retrospective to look back and take stock of how far Acquia has come over the past 12 months. I take the time to write these retrospectives because I want to keep a record of the changes we've gone through as a company. It also helps me track my thinking and personal growth year over year.

    If you'd like to read my previous retrospectives, you can find them here: 2019, 2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009. This year marks the publishing of my twelfth retrospective. When read together, these posts provide a comprehensive overview of Acquia's trajectory.

    COVID-19

    2020 was a strange year. Since March 2020, I've only been in an Acquia office twice. Despite the changes the pandemic brought upon us all, Acquia did well in 2020. We continued our unbroken, 14-year growth streak.

    We were proud to help many of our customers cope with COVID-19. A few notable examples:

    • The State of New York spun up COVID-19 sites in just three days using Acquia Site Factory. We supported 49 million service interactions and 342 million page views across 60 million users at the onset of the pandemic.
    • King Arthur Baking Company attracted a new digital audience of pandemic bakers that led to a 200% year-over-year growth in e-commerce sales.
    • When stores first closed, Godiva used Acquia Customer Data Platform (CDP) to double email-open rates and nearly triple click-through rates. Their website traffic was up 63%.
    Execution on product vision

    Open Source, digital transformation, cloud computing and data-driven marketing are powerful tailwinds transforming how every company does business. 2020 only accelerated these trends, which Acquia benefited from.

    While COVID-19 changed some of Acquia's 2020 plans, our vision and strategy didn't waver. We have a clear vision for how to redefine Digital Experience Platforms. For years now, we've been patient investors and builders towards that vision.

    Throughout 2020 we continued to invest decisively in our Web Content Management solutions while accelerating our move into the broader Digital Experience Platform market. "Decisively", because we grew our Marketing Cloud engineering team by 65% and our Drupal Cloud engineering team by 35% — a testament to the strength of our company amidst a pandemic.

    The best way to learn about what Acquia has been working on is to watch the recording of my Acquia Engage keynote. The presentation is only two months old, and in 30 minutes I cover 10 major product updates. Rather than repeat them here, take a look at the video.

Our product portfolio is organized in two lines of business: the Drupal Cloud and the Marketing Cloud. Content is at the core of the Drupal Cloud, and data is at the core of the Marketing Cloud.

    Acquia's product portfolio exiting 2020.

    Acquia's Drupal Cloud remains the number one Drupal platform in the enterprise with 40% of Fortune 100 companies as our customers. Acquia is the number one vendor in terms of performance, scalability, security and compliance. Across our customers, we served more than 500 billion HTTP requests in 2020.

    Acquia's Marketing Cloud managed over 3 billion customer profiles, 20 billion e-commerce transactions, and over 100 billion customer interactions in 2020. We made nearly 1.5 billion machine learning predictions every day.

    This year, we saw much larger spikes than normal on Black Friday and Cyber Monday. These huge spikes are not surprising given much store traffic turned to digital. The autoscaling of the platform helped us handle these spikes without hiccups.

    As of November 2020, Acquia moved completely off of Marketo to Acquia Campaign Studio (based on the Open Source Mautic). While this move probably won't come as a surprise, it is an important milestone for us. Marketo was a critical part of Acquia's marketing operations so we're excited to practice what we preach.

    At the beginning of 2018, shortly after Mike Sullivan joined Acquia as our CEO, we set a goal to become a leader in Digital Experience Platforms within three years. We did it in two years. 2020 was the first time Acquia was recognized as a leader in the Gartner MQ for Digital Experience Platforms. We're now among leaders like Salesforce and Adobe, which I consider some of the very best software companies in the world.

The success that Acquia earned during 2020 was not just driven by our strategy and roadmap execution. It was also the result of our unique culture and how we continued to support our teams throughout the pandemic. We were recognized in Great Places to Work UK (Excellence in Wellbeing) and Great Places to Work India (Top IT Companies), and were named a Boston Globe Top Place to Work.

    Contributing to Open Source

    Drupal did well in 2020. After five years of work, Drupal 9 was released, bringing many important improvements. In 2020, Drupal received contributions from more than 8,000 different individuals and more than 1,200 different organizations. The total number of contributions to Drupal increased year over year. For example, the Drupal community worked on 4,195 different Drupal.org projects in 2020 — a large 20% year-over-year increase compared to 2019.

Acquia remained the top commercial contributor to Drupal in 2020:

    The contribution gap between Acquia and other PaaS and hosting companies is very large. According to Drupal.org's credit system, Acquia contributes 15x more than Pantheon and 80x more than Platform.sh.

    One specific contribution that I'm extra proud of is that Acquia bought advertising on Drupal.org, and used said advertising to highlight the 10 Acquia partners who contribute back to Drupal the most. It's always important to promote the organizations that contribute to Drupal, but in an economic downturn, it's even more important.

The Acquia-paid banner that was on Drupal.org for most of 2020. It promotes Third and Grove, Acro Media, Mediacurrent, QED42, CI&T, FFW, Palantir.net, Lullabot, Four Kitchens, Phase2 and Srijan.

    We also contributed to Mautic: we helped evolve Mautic's governance, release Mautic 3, and organize the first ever MautiCon. The Mautic project made 13 releases over the past year compared to only 3 releases in 2019 before Acquia acquired Mautic. We're also seeing a steady growth in community members and active contributors.

    A year of personal growth

    Acquia started off 2020 as a new member of the Vista portfolio. I've been learning a lot from working with Vista and implementing the best practices Vista is famous for. From a personal growth perspective, I wouldn't trade the experience for anything.

At the end of 2019, Acquia lost a great leader, Mike Aeschliman, our Head of Engineering. Mike and I were joined at the hip. Because of Mike's passing, I stepped in to help run Engineering, as well as Product, for the first three months of 2020. In March, John Mandel joined us to fill Mike's shoes. John has been a great leader and partner. Throughout 2020, Mike remained in my thoughts, especially when we achieved milestones that he and I had been working towards. I believe he'd be proud of our progress.

    Early in 2020, I organized my team into 4 groups: Drupal Cloud, Marketing Cloud, Product Marketing, and User Experience. I spent the year scaling and operationalizing the R&D organization: hiring, setting and tracking goals, managing 10+ product initiatives, evolving our pricing and packaging, reviewing business cases, improving our messaging and positioning, and more. I spent as much time focused on managing the top line as managing our bottom line — proof that Acquia is no longer a startup. It was a dense year; 10 to 16 meetings a day, 5 days a week. Every day was packed, but Acquia is better for it.

    I did quite a few virtual speaking engagements in 2020, including my keynotes at DrupalCon Global, DrupalCon Europe and Web Summit. With COVID-19, it was uncertain if DrupalCon would happen, but I'm glad it still did. Virtual events are not the same as in-person events though; I miss the travel experience, direct attendee feedback and personal interactions. In-person events give me energy; virtual events don't.

    Thank you

    As I mentioned at the beginning of this post, 2020 was a strange year. On days like today, when looking back at the past year, I am reminded of how lucky I am. I'm fortunate to be working at a healthy and growing company during these uncertain times. I hope that 2021 brings good health, predictability and stability for everyone.

    Dries

    Can someone add some more HTML tags, please?

    2 weeks 1 day ago

    Every day, millions of new web pages are added to the internet. Most of them are unstructured, uncategorized, and nearly impossible for software to understand. It irks me.

    Look no further than Sir Tim Berners-Lee's Wikipedia page:

What Wikipedia editors write (source).
What visitors of Wikipedia see.

    At first glance, there is no rhyme or reason to Wikipedia's markup. (Wikipedia also has custom markup for hieroglyphs, which admittedly is pretty cool.)

    The problem? Wikipedia is the world's largest source of knowledge. It's a top 10 website in the world. Yet, Wikipedia's markup language is nearly impossible to parse, Tim Berners-Lee's Wikipedia page has almost 100 HTML validation errors, and the page's generated HTML output is not very semantic. It's hard to use or re-use with other software.

    I bet it irks Sir Tim Berners-Lee too.

What Wikipedia editors write (source).
What the browser sees; the HTML code Wikipedia (MediaWiki) generates.

It's not just Wikipedia. Every site is still messing around with custom <div>s for a table of contents, footnotes, logos, and more. I could think of a dozen new HTML tags that would make web pages, including Wikipedia, easier to write and reuse.

    A good approach would be to take the most successful Schema.org schemas, Microformats and Web Components, and incorporate their functionality into the official HTML specification.
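For example, a table-of-contents element could be prototyped with today's Web Components APIs and, once proven, promoted into the specification. The sketch below is hypothetical; the element name and behavior are invented for illustration:

```typescript
// A sketch of the Web Components route: a hypothetical <table-of-contents>
// element that assembles its own list from the page's h2 headings. If
// semantics like this were folded into the HTML specification, authors
// could retire a lot of custom <div> soup.
class TableOfContents extends HTMLElement {
  connectedCallback(): void {
    const list = document.createElement("ol");
    document.querySelectorAll("h2").forEach((heading) => {
      const item = document.createElement("li");
      item.textContent = heading.textContent;
      list.appendChild(item);
    });
    this.appendChild(list);
  }
}

// Custom element names must contain a hyphen.
customElements.define("table-of-contents", TableOfContents);
```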

    Adding new semantic markup options to the HTML specification is the surest way to improve the semantic web, improve content reuse, and advance content authoring tools.

    Unfortunately, I don't see new tags being introduced. I don't see experiments with Web Components being promoted to official standards. I hope I'm wrong! (Cunningham's Law states that the best way to get the right answer on the internet is not to ask a question; it's to post the wrong answer. If I'm wrong, I'll update this post.)

    If you want to help make the web better, you could literally start with Sir Tim Berners-Lee's Wikipedia page, and use it as the basis to spend a decade pushing for HTML markup improvements. It could be the start of a long and successful career.

    Dries

    The long-term, world-changing promise of the blockchain

    2 weeks 3 days ago

    I enjoyed reading Vitalik's 2020 endnotes. Vitalik is one of the founders of Ethereum, and one of the most interesting people in the world to follow right now.

    Like Vitalik, I'm interested in economic systems, multi-stakeholder coordination, and public good governance and sustainability. How do we create Open Source communities that will thrive for hundreds of years to come? How do we make sure the Open Web is still thriving in a thousand years? These are some of the questions I think about.

    While I think about these things, Vitalik is making it happen. The blockchain world is experimenting with new funding models (e.g. ICOs, DAICOs or quadratic funding), decision-making models (e.g. quadratic voting), organizational models (e.g. DAOs), and architectural innovation (e.g. Filecoin or Blockstack).

    The blockchain allows these concepts to be implemented in a robust and secure way. Eventually, they could be used to help sustain and govern public goods like Open Source projects and the Open Web.

    But it's not just Open Source or the Open Web that should be considered. Some of the very biggest problems in the world (for example, climate change) are multi-stakeholder problems that require better funding, coordination and decision-making models too.

    So, yes, these are important developments to pay attention to!

    Dries