Planet Drupal

Drupal.org - aggregated feeds in category Planet Drupal

Valuebound: Step-by-step guide to Drush & Drush Aliases to make sure your web application has quick releases

Tue, 10/03/2017 - 10:33

Have you ever thought about how your business can make sure its web application has quick releases in order to sustain the long race? This has become easier to manage through continuous development and continuous integration using Drush and Drush aliases. Drupal web development is one area where multiple command-line interface (CLI) tools are available to make a developer's life easy, and among them the two most important are Drush and Drupal Console.

In this blog, we will take a brief look at Drush & Drush Aliases and how it can make developer’s tedious manual web development tasks easy by offering various commands to…
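To give a flavour of what the post covers, here is a minimal sketch of a Drush 8-style site alias file. The file name, paths, hostnames, and alias names below are hypothetical placeholders, not values from the original post:

```php
<?php

// example.aliases.drushrc.php: a hypothetical Drush 8 alias file.
// Placed in ~/.drush/, it lets you target environments by name.

// A remote "dev" environment, reachable over SSH.
$aliases['dev'] = [
  'uri' => 'dev.example.com',
  'root' => '/var/www/dev/docroot',
  'remote-host' => 'dev.example.com',
  'remote-user' => 'deploy',
];

// A local copy of the same site.
$aliases['local'] = [
  'uri' => 'example.localhost',
  'root' => '/var/www/local/docroot',
];
```

With aliases like these in place, commands such as `drush @example.dev status` or `drush sql-sync @example.dev @example.local` run against the named environment, which is what makes scripted, repeatable releases practical.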

aleksip.net: Should Facebook be trusted on React and patents?

Tue, 10/03/2017 - 09:48
Dries Buytaert, the BDFL of Drupal, has just published a blog post in which he states that “React is the right direction to move for Drupal's administrative interfaces”. A related issue has also been opened on drupal.org.

Agiledrop.com Blog: AGILEDROP: Our Drupal Blogs from September

Tue, 10/03/2017 - 09:40
It's the beginning of a new month, so it's time to look at all the Drupal blogs we wrote for you in September. First, we announced that, after a long break, we would again be present at a Drupal event. During the holidays in July and August we were not so active in that regard, so it was the right time to point out that we were heading to DrupalCamp Antwerp. Our commercial director Iztok Smolic had a session there, and our client adviser Ales Kohek was taking part in a Drupal event for the first time. But more about that later. Our second blog topic in September was again…

Appnovation Technologies: Appnovator Spotlight: Ed Cann

Tue, 10/03/2017 - 08:50
Here's an insight into Ed, the man with the 'Cann do' attitude... Who are you? What's your story? I'm Ed, a Welshman born and bred in Swansea. I grew up fiddling with computers from an early age, starting with my Commodore +4. After university I decided that engineering wasn't for me and web development was where my talent lay, so started with a few f...

PreviousNext: DrupalCon Vienna session retro: Test all the things!

Tue, 10/03/2017 - 05:57

Last week I was fortunate enough to attend and deliver a session at DrupalCon Vienna. The session was based around leveraging and getting productive with the automated testing tools we use in the Drupal community.

by Sam Becker / 3 October 2017

For the kind of large-scale projects we work on, it's essential that automated testing is a priority and firmly embedded in our technical culture. Stability and maintainability of the code we're working on helps to build trusting relationships and happy technical teams. I have long been engaged with the development of automated testing in Drupal core, and internally we've worked hard to adapt these processes to the projects we build and fill in any blanks where required.

I was fortunate enough to be selected to share this at DrupalCon Vienna. Without further ado, I present, Test all the things! Get productive with automated testing in Drupal 8:

Our current testing ethos is based around using the same tools for core and contrib as for our bespoke Drupal project builds. Doing so allows us to context-switch between our own client work and contributed-project or core work. To make this work, we've addressed a few gaps in what's available to us out of the box.

Current State of Testing

I had some great conversations after the session with developers who were just starting to explore automated testing in Drupal. While the tools at our disposal are powerful, there is still a lot of Drupal-specific knowledge required to become productive. My hope is that the session helped to fill in some of the blanks in this regard.

E2E Testing

Because all of the test cases in core are isolated and use individually set-up environments/installations, end-to-end testing is tricky without some additional work. One of the touch points in the session was skipping the traditional set-up process and running the existing test classes against pre-provisioned environments. Doing so replicates production-like environments in a test suite, which helps provide a high level of confidence that tests are asserting behaviors of the whole system. Bringing this into core as a native capability is being discussed on drupal.org and was touched on in the session.

JS Unit Testing

One thing Drupal core has yet to address is JavaScript unit testing. For complex front ends, testing JS application code through a browser can become clumsy and hard to maintain. One approach we've used to address this is Jest. It nicely complements front ends where individual JavaScript modules can be isolated and individually tested.
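As a hedged illustration of that pattern (the module and test names below are invented, not from the session): keep the logic in a small framework-free module, then cover it with a Jest spec, with no browser or Drupal bootstrap required.

```javascript
// currency.js: a hypothetical, framework-free module that is easy
// to unit test in isolation.
function formatPrice(cents) {
  if (!Number.isFinite(cents)) {
    throw new TypeError('cents must be a finite number');
  }
  return '$' + (cents / 100).toFixed(2);
}

module.exports = { formatPrice };

// currency.test.js would be the matching Jest spec (run with `npx jest`):
//
//   const { formatPrice } = require('./currency');
//
//   test('formats cents as dollars', () => {
//     expect(formatPrice(1999)).toBe('$19.99');
//   });
```

Because the module has no DOM or Drupal dependencies, the spec runs in milliseconds under Jest's Node-based runner, which is exactly what makes this style maintainable compared with browser-driven JS testing.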

Summing up, attending DrupalCon Vienna, presenting the session and meeting the members of the broader community was a great experience. I'm hopeful my session was able to contribute to the outstanding quality of sessions and technical discussions.

Tagged DrupalCon, DrupalCon Vienna, Testing

Posted by Sam Becker
Senior Developer

Dated 3 October 2017


Dcycle: Letsencrypt HTTPS for Drupal on Docker

Tue, 10/03/2017 - 02:00

This article is about serving your Drupal Docker container, and/or any other container, via https with a valid Let’s encrypt SSL certificate.

Step one: make sure you have a public VM

To follow along, create a new virtual machine (VM) with Docker, for example using the “Docker” distribution in the “One-click apps” section of Digital Ocean.

This will not work on localhost, because in order to use Let’s Encrypt, you need to demonstrate ownership over your domain(s) to the outside world.

In this tutorial we will serve two different sites, one simple HTML site and one Drupal site, each using standard ports, on the same Docker host, using a reverse proxy, a container which sits in front of your other containers and directs traffic.

Step two: Set up two domains or subdomains you own and point them to your server

Start by making sure you have two domains which point to your server; in this example we'll use:

  • test-one.example.com will be a simple HTML site.
  • test-two.example.com will be a Drupal site.
Step three: create your sites

We do not want to map our containers’ ports directly to our host ports using -p 80:80 -p 443:443 because we will have more than one app using the same port (the secure 443). Port mapping will be the responsibility of the reverse proxy (more on that later). Replace example.com with your own domain:

DOMAIN=example.com

docker run -d \
  -e "VIRTUAL_HOST=test-one.$DOMAIN" \
  -e "LETSENCRYPT_HOST=test-one.$DOMAIN" \
  -e "LETSENCRYPT_EMAIL=my-email@$DOMAIN" \
  --expose 80 --name test-one \
  httpd

docker run -d \
  -e "VIRTUAL_HOST=test-two.$DOMAIN" \
  -e "LETSENCRYPT_HOST=test-two.$DOMAIN" \
  -e "LETSENCRYPT_EMAIL=my-email@$DOMAIN" \
  --expose 80 --name test-two \
  drupal

Now you have two running sites, but they’re not yet accessible to the outside world.

Step four: a reverse proxy and Let's Encrypt

The term “proxy” means something which represents something else. In our case we want a webserver container which represents our Drupal and HTML containers; the Drupal and HTML containers are effectively hidden behind the proxy. Why “reverse”? The plain term “proxy” is already used and means that the web user is hidden from the server. When it is the web servers that are hidden (in this case the Drupal and HTML containers), we use the term “reverse proxy”.

Let’s Encrypt is a free certificate authority which certifies that you are the owner of your domain.

We will use nginx-proxy as our reverse proxy. Because nginx-proxy does not take care of certificates, we will use the LetsEncrypt companion container for nginx-proxy to set up and maintain Let’s Encrypt certificates.

Let’s start by creating an empty directory which will contain our certificates:

mkdir "$HOME"/certs

Now, following the instructions of the LetsEncrypt companion project, we can set up our reverse proxy:

docker run -d -p 80:80 -p 443:443 \
  --name nginx-proxy \
  -v "$HOME"/certs:/etc/nginx/certs:ro \
  -v /etc/nginx/vhost.d \
  -v /usr/share/nginx/html \
  -v /var/run/docker.sock:/tmp/docker.sock:ro \
  --label com.github.jrcs.letsencrypt_nginx_proxy_companion.nginx_proxy \
  jwilder/nginx-proxy

And, finally, start the LetsEncrypt companion:

docker run -d \
  --name nginx-letsencrypt \
  -v "$HOME"/certs:/etc/nginx/certs:rw \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  --volumes-from nginx-proxy \
  jrcs/letsencrypt-nginx-proxy-companion

Wait a few minutes for "$HOME"/certs to be populated with your certificate files, and you should now be able to access your sites:

  • https://test-two.example.com/ should show the Drupal installer (setting up a MySQL container to actually install Drupal is outside the scope of this article);
  • https://test-one.example.com should show the “It works!” page.
  • In both cases, the certificate should be valid and you should get no error message.
  • http://test-one.example.com should redirect to https://test-one.example.com
  • http://test-two.example.com should redirect to https://test-two.example.com
A note about renewals

Let’s Encrypt certificates last three months, so we generally want to renew every two months. The LetsEncrypt companion container for nginx-proxy states that it automatically renews certificates which are set to expire in less than a month, and that it checks for this hourly, although there are some renewal-related issues in the issue queue.

It seems to also be possible to force renewals by running:

docker exec nginx-letsencrypt /app/force_renew

So it might be worth being on the lookout for failed renewals and forcing them if necessary.
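A minimal sketch of such a lookout, run for example daily from cron. The certificate path and container name match the setup in this tutorial but should be treated as assumptions to adjust; openssl's `-checkend` flag exits non-zero when the certificate expires within the given number of seconds (1209600 seconds is 14 days):

```shell
#!/bin/sh
# Hypothetical renewal watchdog for the setup above.

# Succeeds (exit 0) when the certificate in $1 expires within 14 days.
expires_soon() {
  ! openssl x509 -noout -checkend 1209600 -in "$1" 2>/dev/null
}

check_and_renew() {
  cert="$HOME/certs/test-one.example.com.crt"
  if [ -f "$cert" ] && expires_soon "$cert"; then
    echo "Certificate expires soon; forcing renewal."
    docker exec nginx-letsencrypt /app/force_renew
  fi
}

check_and_renew
```

This only complements, and does not replace, the companion container's own hourly checks.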

Enjoy!

You can now bask in the knowledge that your cooking blog will not be man-in-the-middled.


Palantir: Drupal 8 is Great for Easy Publishing

Mon, 10/02/2017 - 21:19
By Alex Brandt, Oct 2, 2017

The #D8isGr8 blog series will focus on why we love Drupal 8 and how it provides solutions for our clients. This post in the series comes from Alex Brandt, Marketing Lead.

In this post we will cover...
  • What changes Drupal 8 has made to the editing experience
  • How Drupal 8 promotes accessibility
  • One way we use Drupal 8 to connect with our audience


Oh Drupal 8, how do I love thee? Let me count the ways… As a content editor on a small team, I welcome every chance I get to publish something more easily, quickly, and effectively. My first experience publishing content in Drupal was in Drupal 7, and without previous HTML experience, it was a time-consuming endeavor. Although there is a plethora of reasons why I love publishing content in Drupal 8, I’ll narrow it down to my top three.

1.) WYSIWYG FTW!

This little bar is my best friend:

A quick WYSIWYG editor (CKEditor) is now standard in Drupal 8 core, which means there’s no need to look up the HTML every time I want to include a link, stylize a heading, or insert an image. The amount of time I save when publishing is awesome, but it also prevents me from using sloppy code that could become an issue later down the line if we migrate content.

2.) Keeping Things Accessible with Alt Text

Drupal 8 now flags when you need alternative text (alt text), and it doesn’t allow you to publish a post without providing these descriptions. We always strive to make our corner of the web equally accessible for all users, and this is a safeguard to make sure we continue doing so. You can read more about why alt text is important in our recent post on accessibility.

This red asterisk prompt displays every time you insert an image.

3.) Customization

Just like most institutions, our website is one of the most important marketing tools for our agency. Not only does it provide us with a place to share knowledge with our audience, it provides different ways for our audience to engage with us.

One of the easiest ways we are able to connect with our clients, partners, and community is by creating customizable call-to-action buttons to display in various places on our site. These buttons allow our site visitors to sign up for our newsletter, schedule a time to chat with us, register for a webinar, or take any other action we hope they take. By having the ability to customize each button (as opposed to only having a generic contact-us button), we can make sure the call-to-action buttons fit the content where they are displayed. Drupal 8 makes these buttons easy to create (once we set up our desired fields).

Different options for customizing CTA buttons.

Easy Publishing in Drupal 8

All of these features in Drupal 8 allow me to share tailored content with our audience without becoming bogged down by the technology. And because I know you were wondering, the time it took me to take this blog post from Google Doc to published? 3 minutes, 17 seconds.

We want to make your project a success.

Let's Chat.

Dries Buytaert: Drupal looking to adopt React

Mon, 10/02/2017 - 19:32

Last week at DrupalCon Vienna, I proposed adding a modern JavaScript framework to Drupal core. After the keynote, I met with core committers, framework managers, JavaScript subsystem maintainers, and JavaScript experts in the Drupal community to discuss next steps. In this blog post, I look back on how things have evolved since the last time we explored adding a new JavaScript framework to Drupal core two years ago, and what we believe are the next steps after DrupalCon Vienna.

As a group, we agreed that we had learned a lot from watching the JavaScript community grow and change since our initial exploration. We agreed that today, React would be the most promising option given its expansive adoption by developers, its unopinionated and component-based nature, and its suitability for building new Drupal interfaces in an incremental way. Today, I'm formally proposing that the Drupal community adopt React, after discussion and experimentation has taken place.

Two years ago, it was premature to pick a JavaScript framework

Three years ago, I developed several convictions related to "headless Drupal" or "decoupled Drupal". I believed that:

  1. More and more organizations wanted a headless Drupal so they can use a modern JavaScript framework to build application-like experiences.
  2. Drupal's authoring and site building experience could be improved by using a more modern JavaScript framework.
  3. JavaScript and Node.js were going to take the world by storm and that we would be smart to increase the amount of JavaScript expertise in our community.

(For the purposes of this blog post, I use the term "framework" to include both full MV* frameworks such as Angular, and also view-only libraries such as React combined piecemeal with additional libraries for managing routing, states, etc.)

By September 2015, I had built up enough conviction to write several long blog posts about these views (post 1, post 2, post 3). I felt we could accomplish all three things by adding a JavaScript framework to Drupal core. After careful analysis, I recommended that we consider React, Ember and Angular. My first choice was Ember, because I had concerns about a patent clause in Facebook's open-source license (since removed) and because Angular 2 was not yet in a stable release.

At the time, the Drupal community didn't like the idea of picking a JavaScript framework. The overwhelming reactions were these: it's too early to tell which JavaScript framework is going to win, the risk of picking the wrong JavaScript framework is too big, picking a single framework would cause us to lose users that favor other frameworks, etc. In addition, there were a lot of different preferences for a wide variety of JavaScript frameworks. While I'd have preferred to make a bold move, the community's concerns were valid.

Focusing on Drupal's web services instead

By May of 2016, after listening to the community, I changed my approach; instead of adding a specific JavaScript framework to Drupal, I decided we should double down on improving Drupal's web service APIs. Instead of being opinionated about what JavaScript framework to use, we would allow people to use their JavaScript framework of choice.

I did a deep dive on the state of Drupal's web services in early 2016 and helped define various next steps (post 1, post 2, post 3). I asked a few of the OCTO team members to focus on improving Drupal 8's web services APIs; funded improvements to Drupal core's REST API, as well as JSON API, GraphQL and OpenAPI; supported the creation of Waterwheel projects to help bootstrap an ecosystem of JavaScript front-end integrations; and most recently supported the development of Reservoir, a Drupal distribution for headless Drupal. There is also a lot of innovation coming from the community with lots of work on the Contenta distribution, JSON API, GraphQL, and more.

The end result? Drupal's web service APIs have progressed significantly the past year. Ed Faulkner of Ember told us: "I'm impressed by how fast Drupal made lots of progress with its REST API and the JSON API contrib module!". It's a good sign when a core maintainer of one of the leading JavaScript frameworks acknowledges Drupal's progress.

The current state of JavaScript in Drupal

Looking back, I'm glad we decided to focus first on improving Drupal's web services APIs; we discovered that there was a lot of work left to stabilize them. Cleanly integrating a JavaScript framework with Drupal would have been challenging 18 months ago. While there is still more work to be done, Drupal 8's available web service APIs have matured significantly.

Furthermore, by not committing to a specific framework, we are seeing Drupal developers explore a range of JavaScript frameworks and members of multiple JavaScript framework communities consuming Drupal's web services. I've seen Drupal 8 used as a content repository behind Angular, Ember, React, Vue, and other JavaScript frameworks. Very cool!

There is a lot to like about how Drupal's web service APIs matured and how we've seen Drupal integrated with a variety of different frameworks. But there is also no denying that not having a JavaScript framework in core came with certain tradeoffs:

  1. It created a barrier for significantly leveling up the Drupal community's JavaScript skills. In my opinion, we still lack sufficient JavaScript expertise among Drupal core contributors. While we do have JavaScript experts working hard to maintain and improve our existing JavaScript code, I would love to see more experts join that team.
  2. It made it harder to accelerate certain improvements to Drupal's authoring and site building experience.
  3. It made it harder to demonstrate how new best practices and certain JavaScript approaches could be leveraged and extended by core and contributed modules to create new Drupal features.

One trend we are now seeing is that traditional MV* frameworks are giving way to component libraries; most people seem to want a way to compose interfaces and interactions with reusable components (e.g. libraries like React, Vue, Polymer, and Glimmer) rather than use a framework with a heavy focus on MV* workflows (e.g. frameworks like Angular and Ember). This means that my original recommendation of Ember needs to be revisited.

Several years later, we still don't know what JavaScript framework will win, if any, and I'm willing to bet that waiting two more years won't give us any more clarity. JavaScript frameworks will continue to evolve and take new shapes. Picking a single one will always be difficult and to some degree "premature". That said, I see React having the most momentum today.

My recommendations at DrupalCon Vienna

Given that it's been almost two years since I last suggested adding a JavaScript framework to core, I decided to bring the topic back in my DrupalCon Vienna keynote presentation. Prior to my keynote, there had been some renewed excitement and momentum behind the idea. Two years later, here is what I recommended we should do next:

  • Invest more in Drupal's API-first initiative. In 2017, there is no denying that decoupled architectures and headless Drupal will be a big part of our future. We need to keep investing in Drupal's web service APIs. At a minimum, we should expand Drupal's web service APIs and standardize on JSON API. Separately, we need to examine how to give API consumers more access to and control over Drupal's capabilities.
  • Embrace all JavaScript frameworks for building Drupal-powered applications. We should give developers the flexibility to use their JavaScript framework of choice when building front-end applications on top of Drupal — so they can use the right tool for the job. The fact that you can front Drupal with Ember, Angular, Vue, React, and others is a great feature. We should also invest in expanding the Waterwheel ecosystem so we have SDKs and references for all these frameworks.
  • Pick a framework for Drupal's own administrative user interfaces. Drupal should pick a JavaScript framework for its own administrative interface. I'm not suggesting we abandon our stable base of PHP code; I'm just suggesting that we leverage JavaScript for the things that JavaScript is great at by moving relevant parts of our code from PHP to JavaScript. Specifically, Drupal's authoring and site building experience could benefit from user experience improvements. A JavaScript framework could make our content modeling, content listing, and configuration tools faster and more application-like by using instantaneous feedback rather than submitting form after form. Furthermore, using a decoupled administrative interface would allow us to dogfood our own web service APIs.
  • Let's start small by redesigning and rebuilding one or two features. Instead of rewriting the entirety of Drupal's administrative user interfaces, let's pick one or two features, and rewrite their UIs using a preselected JavaScript framework. This allows us to learn more about the pros and cons, allows us to dogfood some of our own APIs, and if we ultimately need to switch to another JavaScript framework or approach, it won't be very painful to rewrite or roll the changes back.
Selecting a JavaScript framework for Drupal's administrative UIs

In my keynote, I proposed a new strategic initiative to test and research how Drupal's administrative UX could be improved by using a JavaScript framework. The feedback was very positive.

As a first step, we have to choose which JavaScript framework will be used as part of the research. Following the keynote, we had several meetings at DrupalCon Vienna to discuss the proposed initiative with core committers, all of the JavaScript subsystem maintainers, as well as developers with real-world experience building decoupled applications using Drupal's APIs.

There was unanimous agreement that:

  1. Adding a JavaScript framework to Drupal core is a good idea.
  2. We want to have sufficient real-use experience to make a final decision prior to 8.6.0's development period (Q1 2018). To start, the Watchdog page would be the least intrusive interface to rebuild and would give us important insights before kicking off work on more complex interfaces.
  3. While a few people named alternative options, React was our preferred option, by far, due to its high degree of adoption, component-based and unopinionated nature, and its potential to make Drupal developers' skills more future-proof.
  4. This adoption should be carried out in a limited and incremental way so that the decision is easily reversible if better approaches come later on.

We created an issue on the Drupal core queue to discuss this more.

Conclusion

Drupal should support a variety of JavaScript libraries on the user-facing front end while relying on a single shared framework as a standard across Drupal administrative interfaces.

In short, I continue to believe that adopting more JavaScript is important for the future of Drupal. My original recommendation to include a modern JavaScript framework (or JavaScript libraries) for Drupal's administrative user interfaces still stands. I believe we should allow developers to use their JavaScript framework of choice to build front-end applications on top of Drupal and that we can start small with one or two administrative user interfaces.

After meeting with core maintainers, JavaScript subsystem maintainers, and framework managers at DrupalCon Vienna, I believe that React is the right direction to move for Drupal's administrative interfaces, but we encourage everyone in the community to discuss our recommendation. Doing so would allow us to make Drupal easier to use for site builders and content creators in an incremental and reversible way, keep Drupal developers' skills relevant in an increasingly JavaScript-driven world, and move us ahead with modern tools for building user interfaces.

Special thanks to Preston So for contributions to this blog post and to Matt Grill, Wim Leers, Jason Enter, Gábor Hojtsy, and Alex Bronstein for their feedback during the writing process.

Acro Media: Video: Shipping in Drupal Commerce 2.x is Better Than Ever!

Mon, 10/02/2017 - 14:45

“Shipping” in Commerce 1 meant “get shipping rates.” End of story. If you wanted to do something crazy like actually receive the item or put it in a box in the warehouse, you were out of luck. You could integrate with another system, but otherwise you were really just a storefront.

But Commerce 2.x is a different story. Now you can go from getting rates all the way down to actually receiving the shipment.

Amazee Labs: DrupalCon Vienna Friday Sprints

Mon, 10/02/2017 - 13:17

At the end of a great month of cycling and a great week of summits, trainings, keynotes and more at #DrupalConEUR, the final day of this week-long conference was all about sprinting. Let me share my wrap-up of the DrupalCon Friday sprints in this blog post.

By Josef Dabernig

The Messe Wien conference center was split up into 3 areas: the first-time sprinter workshop, mentored core sprints as well as general sprints. Let’s go through them one by one.

1) The first-time sprinter workshop brings new contributors up to speed on setting up a Drupal 8 environment, understanding the contribution process, and finding their first novice issues to tackle. This process has been tested at various previous DrupalCons and has turned out to be highly effective at recruiting and onboarding potential future Drupal contributors.

The group of sprint mentors runs through duties in the morning. Rachel Lawson (rachel_norfolk) blogged about her experience working together with the highly dedicated team of mentors.

At the first-time sprinter workshop, besides learning the tools, processes and technology, the main emphasis is on being able to collaborate in person with other community members, such as, in this case, Jen Lampton (jenlampton) from the US together with Chris Maiden (matason) from the UK.
 

2) The mentored core sprints are designed to take those who have gotten their feet wet in the first-time sprinter workshop or already have prior contribution experience to the next level. The setup of the second room with round tables focused on different topics such as Drupal core subsystems or initiatives allows engaging directly with mentors specialized in those skill areas. New contributors will work side-by-side with experienced core contributors on core tasks.

Mentors, such as Fatima Sarah Kahlid (sugaroverflow) from Canada, provide individual advice to those sprinting on an issue. The goal is to help a new contributor on their way through the process and learn from each other.

The mentors all wore green t-shirts, and we used name tags for every attendee to make it easy to know who can help and to avoid having to memorize hundreds of names within a few hours. This is Michael Lenahan (michaellenahan) making an announcement to the crowd of sprinters at DrupalCon Vienna.

 

3) The general sprints are where all the other magic happens. Here you will find other Drupal core initiatives and Drupal module maintainers sprinting together on topics they care about moving forward. The format is similar to the mentored core sprints in that tables focus on certain topics, but without the official sprint mentors; instead, each initiative self-organizes, with or without a given structure.

A huge spreadsheet is used every year to pre-organize sprints. Here individuals can sign up for sprints happening during the week and take part in individual sprint initiatives such as working on “Drupal 8 criticals and majors”, “Migrate”, or “Usability / Redesign the Admin UI”.

A busy and growing table was the “Search API Family”, where Thomas Seidl (drunken monkey) sprinted together with many other contributors on Search API and related modules such as Facets. Note that the Search API module was also given the prize in the Drupal category of the Open Minds award, which was held during the week of DrupalCon on Tuesday. Together with Entity API by Wolfgang Ziegler (fago) and GraphQL by Sebastian Siemssen (fubhy) and Philipp Melab (pmelab), it was recognized as one of the most valuable Drupal contributions from Austria.

---

The sprints were concluded with a very special moment, the Drupal Core Live Commit.

Lauri Eskola (lauriii), provisional core committer, performed a live commit on stage. The seemingly trivial issue “Add @internal to schemaDefinition() methods” was reviewed, showing how the process works. The issue had been worked on by three contributors, Valery Lourie (valthebald), Kevin Wenger (wengerk) and Gilles Doge (gido), and moved from Active via Needs Review to Reviewed & tested by the community. With the approval of core committer Angie Byron (webchick), Lauri was able to commit the improvement not only to the latest 8.5.x development branch but also to 8.4.x, which is currently in release candidate mode.

Shannon Vettes (svettes) and Michael Schmid (schnitzel) also joined the stage to share what they sprinted on. This time it was about an initiative that isn’t necessarily related to writing code but to helping drive change. Drupal-Petitions.org is designed to create a process and tool, similar to https://www.change.org/ or https://petitions.whitehouse.gov/, where the community can prioritize and gather momentum around ideas for improvements.

---

Wrapping up

Friday was all about sprints. As explained, I’m excited about the many ways that new and existing contributors have been working together.

Special thanks to all the sprint mentors, to the DrupalCon events team for the great organization, and to Thunder as the main sponsor of the Friday sprints.

More photos from Friday and the entire conference can be found in our Flickr collection. Interested in sprinting again? Watch out for Drupal Dev Days in 2018 or other upcoming Drupal events in your area.

Comic Relief Technology Blog: Waste not want not: Upcycle your tech!

Mon, 10/02/2017 - 11:18
Working in the charity sector you learn to be pretty resourceful when you need to be, and that doesn’t stop…

ADCI Solutions: What's the difference between single-page application and multi-page application?

Mon, 10/02/2017 - 09:51

The SPA approach to website development is on the rise. It’s cool, it’s popular. Everybody wants to chime in and participate. Don’t forget about the multi-page approach, though: there are many use cases you may love.

 

Read the whole article and learn how to apply those approaches using Drupal, React, Vue.js.

 

OSTraining: Drupal 8 or Drupal 7

Mon, 10/02/2017 - 02:00

Drupal has long been a techie's choice of open source content management system. It may be harder than WordPress or Joomla to set up, but it more than makes up for this with its power and flexibility.

Does Drupal 8 continue this tradition?

Drupal Modules: The One Percent: Remembering Kirk Clawes

Sun, 10/01/2017 - 19:30
NonProfit Sun, 10/01/2017 - 12:30

Vardot: Best Drupal Blogs: List of Valuable Resources To Subscribe To

Sat, 09/30/2017 - 20:27
Dmitrii Susloparov Sat, 09/30/2017 - 21:27

Drupal professionals have to constantly upgrade their skills to keep up with technology. The good news is that much of this knowledge is now available online, so there is no longer any need to spend hours in the library looking for resources that can answer your questions. These days, most topics are covered by a range of blogs.

 

Vardot was featured as one of the top 20 Drupal blogs for Drupal developers. In this post, we recommend several resources (in addition to the one you are reading now, of course) for you to subscribe to. We believe that these resources will give you an excellent overall picture of what is happening in the Drupal community.

 

 

Drupal Blogs You Should Be Reading in 2017

Dries Buytaert blog

Dries' personal blog offers a glimpse of his work at Acquia and his views on Drupal and open-source software, in addition to general news and his opinions about the Drupal community.

 

If you are looking for low-level Drupal tips from the grand master himself, this is not the source for it. Instead, you will find a high-level and strategic perspective of where Drupal has trekked before and where it is heading, from none other than its creator. It will keep you well-informed of Drupal trends.

 

In our opinion, Dries’ blog is simply the best online resource for catching Drupal trends and formalizing your Drupal strategy.

Acquia blog

Acquia is the company that Dries Buytaert co-founded to provide cloud-based Drupal services and, according to a recent report, the number 1 organization for code contributions to Drupal in the 12-month period ending June 30, 2017. The Acquia blog publishes posts by Dries, other Acquia insiders, and guest bloggers about 4 times a week.

 

This blog is the mother lode of knowledge about all things related to delivering Drupal enterprise solutions. You will find posts on best practices, architectural considerations, marketing trends, and more, covering full-cycle Drupal commercialization. Developers should take note of posts from the Acquia Developer Center.

 

If you want to learn more about delivering enterprise Drupal solutions, the Acquia blog is a great resource. Vardot is proud to partner with Acquia to deliver professional hosting and training services.

Lullabot blog

The Lullabot blog averages about 2 new posts per week, and its target audience is enterprise Drupal developers. Building a modern enterprise Drupal website involves integrating multiple open-source technologies that must work well together. Consequently, enterprise developers must be well-rounded in various open-source technologies in addition to Drupal. The Lullabot blog has an excellent coverage of the entire Drupal technology stack.
 

One great aspect of this blog is that it also features a library of podcasts on various Drupal topics. If you have a long commute, these Drupal podcasts are a great way to make good use of your time. (Another good source of Drupal podcasts is DrupalEasy.)

 

If your interests are entirely developer-centric, you may want to subscribe to the Lullabot feed.

Drupalize.Me blog

Drupalize.Me, a sister company to Lullabot, runs a website dedicated to Drupal developer training. It is made up of 2 main components: a blog and a series of technical guides/tutorials. The Drupalize.Me blog mainly posts Drupal community news, and announcements about new Drupalize.Me guides. A small proportion of the guides are free (samplers), while the rest are available for a monthly membership fee.

 

Despite the paid subscription model, Drupalize.Me offers arguably the most systematic approach for Drupal developers of all skill levels to upgrade their Drupal expertise online. The guides are categorized into topics: introduction to Drupal (including Drupal 8), site building, theming, module development (including API), site administration, and backend and infrastructure. The guides cover multiple Drupal versions, including the latest Drupal 8 as well as the older Drupal 6 and 7.

 

Drupalize.Me is a good investment for Drupal developers for continuing their Drupal training because of its breadth in topics and its depth in skill level. For a detailed list of the main online resources for learning Drupal, please consult this Vardot guide.

Volacci's Drupal SEO blog

Volacci's Drupal SEO blog, as its name suggests, targets marketing professionals rather than developers. Marketing has become a critical component of the Drupal community, as evidenced by DrupalCon Vienna 2017, which hosted the very first Drupal Marketing Sprint. So, we include Volacci’s high-caliber Drupal SEO and marketing blog on our recommended subscription list.

 

This blog is updated with a new post about once every 2 weeks. It covers Drupal industry news, SEO techniques and best practices. Ben Finklea, CEO and the primary author of the blog, is a world-renowned Drupal SEO expert. He was also the presenter for the Drupal 8 SEO hands-on seminar at DrupalCon Baltimore 2017.

 

If you are strictly interested in the SEO and marketing perspectives of Drupal, this is a blog that you should definitely follow. For additional quality SEO-related posts, please refer to the SEO tag in our blog.

 

Don't want to read too many Drupal blogs at the same time?

 

No problem, there are several resources where you can find the latest news about Drupal from all over the world. Honorable mentions of blogs worthy of your subscription are listed below.

Planet Drupal

This is the official Drupal blog aggregator. It collects posts from a pre-approved list of Drupal-related blogs. The volume is quite high, about 40 posts per week. The scope spans a broad spectrum of development as well as business and marketing topics.

Reddit Drupal

Reddit Drupal is another high-volume website that covers anything Drupal-related. Because it is hosted on the Reddit platform, you will find the website more interactive than the other Drupal blogs. You can ask questions directly on Reddit or search through the existing posts for possible answers.

The Weekly Drop

This is a handcrafted weekly digest of the best Drupal-related blog posts from each week. If you find following multiple Drupal blogs too time-consuming, you should consider subscribing to the Weekly Drop which can keep you up-to-date with a minimal weekly drop of relevant articles.

Drupal Association Youtube channel

If you could not personally attend a DrupalCon conference, the best consolation is to watch the video recordings of its always educational workshops on Youtube. The Drupal Association Youtube channel has been updated with the workshops presented at the recent DrupalCon Vienna 2017.

 

To keep abreast of developments in the fast-changing Drupal community, we recommend that our readers subscribe to the above Drupal blogs in addition to Vardot’s own. And what is your favorite Drupal blog?

 

qed42.com: Securing Cookie for 3rd Party Identity Management in Drupal

Sat, 09/30/2017 - 10:15

We are in an era where we see a lot of third-party integrations being done in projects. In Drupal-based projects, cookie management is done via Drupal itself to maintain the session, whether it is a pure Drupal project or a decoupled Drupal project.

But what about a scenario where the user’s information is managed by a third-party service and no user information is saved in Drupal? And when authentication is done via some other third-party service? How can we manage cookies in this case to run our site session and also keep it secure?

One way is to set and maintain the cookie on our own. In this case, our users will be anonymous to Drupal, so we keep the session running based on cookies. The user information will be stored in the cookie itself, which can then be validated when a request is made to Drupal.

PHP has a function to set cookies called setcookie(), which we can use to create and destroy cookies. So the flow will be that a user login request made to the website is verified via a third-party service, and then we call setcookie() to set a cookie containing the user information. But securing the cookie is a must, so how do we do that?

For this, let’s refer to the Bakery module to see how it does it. It contains functions for encrypting a cookie, setting it, and validating it.

To achieve this in Drupal 8, we will write a helper class, let’s say “UserCookie.php”, and place it in ‘{modulename}/src/Helper/’. Our cookie helper class will contain static methods for setting and validating cookies. We use static methods so that we can call them from anywhere.

We will have to encrypt the cookie before setting it, so we will use the openssl_encrypt() PHP function in the following manner:

/**
 * Encrypts given cookie data.
 *
 * @param string $cookieData
 *   Serialized cookie data for encryption.
 *
 * @return string
 *   Encrypted cookie.
 */
private static function encryptCookie($cookieData) {
  // Create a key using a secret string.
  $key = openssl_digest(Settings::get('SOME_COOKIE_KEY'), 'sha256');
  // Create an initialization vector to be used for encryption.
  $iv = openssl_random_pseudo_bytes(16);
  // Encrypt the cookie data and prepend the initialization vector so that
  // it can be retrieved again for decryption of this cookie.
  $encryptedCookie = $iv . openssl_encrypt($cookieData, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
  // Add a signature to the cookie.
  $signature = hash_hmac('sha256', $encryptedCookie, $key);
  // Encode the signature and cookie.
  return base64_encode($signature . $encryptedCookie);
}
  1. The string parameter in openssl_digest() can be replaced with any string you want to use as the key. You can keep a simple keyword too.
  2. The same key must be used when decrypting the data.
  3. The same initialization vector will be needed while decrypting the data, so to retrieve it back we prepend it to the encrypted cookie string.
  4. We also add a signature, generated using the same key as above. We will verify this signature while validating the cookie.
  5. Finally, we encode the signature and encrypted cookie data together.
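To make the byte layout concrete, here is a small illustrative sketch in Python (not the module’s code; the AES step is replaced by an opaque stand-in, and the key name is hypothetical) showing the same signature-plus-payload packing that the PHP above produces:

```python
import base64
import hashlib
import hmac
import os

# Stand-in for the hashed site secret ('SOME_COOKIE_KEY' is hypothetical).
KEY = hashlib.sha256(b"SOME_COOKIE_KEY").digest()

def pack(iv: bytes, ciphertext: bytes) -> str:
    # Sign iv + ciphertext; a hex-encoded SHA-256 HMAC is always 64 characters,
    # which is why the decryption side can slice the first 64 bytes back out.
    blob = iv + ciphertext
    signature = hmac.new(KEY, blob, hashlib.sha256).hexdigest().encode()
    return base64.b64encode(signature + blob).decode()

def unpack(packed: str):
    raw = base64.b64decode(packed)
    signature, blob = raw[:64], raw[64:]
    # Constant-time comparison of the recomputed signature.
    expected = hmac.new(KEY, blob, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(signature, expected):
        return None  # Tampered cookie: signature mismatch.
    return blob[:16], blob[16:]  # (iv, ciphertext)

iv, ciphertext = os.urandom(16), b"opaque-aes-output"
assert unpack(pack(iv, ciphertext)) == (iv, ciphertext)
```

In the real helper the ciphertext comes from openssl_encrypt(); the sketch only demonstrates why the decryption side can rely on fixed offsets (64 bytes for the signature, then 16 for the IV).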

For setting cookie:
 

/**
 * Set cookie using user data.
 *
 * @param string $name
 *   Name of cookie to store.
 * @param mixed $data
 *   Data to store in cookie.
 */
public static function setCookie($name, $data) {
  $data = is_array($data) ? json_encode($data) : $data;
  $cookieData = self::encryptCookie($data);
  setcookie($name, $cookieData, Settings::get('SOME_DEFAULT_COOKIE_EXPIRE_TIME'), '/');
}

Note: You can keep 'SOME_COOKIE_KEY' and 'SOME_DEFAULT_COOKIE_EXPIRE_TIME' in your settings.php; Settings::get() will fetch them for you.
Tip: You can also append the expiration time of the cookie to the encrypted data itself, so that you can verify it at decryption time. This will stop anyone from extending the session by manually changing the cookie's timing.
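A minimal sketch of that tip in Python (the field names are illustrative; in the real helper this payload would then go through the encryption step above): embed the expiry inside the payload itself, so the server can reject stale sessions even if the browser-side cookie lifetime was tampered with.

```python
import json
import time

def make_payload(data: dict, ttl_seconds: int) -> str:
    # The expiry travels inside the (soon-to-be-encrypted) payload itself.
    return json.dumps({"data": data, "expires": int(time.time()) + ttl_seconds})

def read_payload(payload: str):
    decoded = json.loads(payload)
    if decoded["expires"] < time.time():
        return None  # Session expired, regardless of the cookie's own lifetime.
    return decoded["data"]

payload = make_payload({"uid": "abc123"}, ttl_seconds=3600)
assert read_payload(payload) == {"uid": "abc123"}
```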

Congrats! We have successfully encrypted the user data and set it into a cookie.

Now let’s see how we can decrypt and validate the same cookie.

To decrypt cookie:

/**
 * Decrypts the given cookie data.
 *
 * @param string $cookieData
 *   Encrypted cookie data.
 *
 * @return bool|mixed
 *   FALSE if the retrieved signature doesn't match,
 *   the decrypted data otherwise.
 */
public static function decryptCookie($cookieData) {
  // Create a key using the same secret string used during encryption.
  $key = openssl_digest(Settings::get('SOME_COOKIE_KEY'), 'sha256');
  // Reverse the base64 encoding of $cookieData.
  $cookieData = base64_decode($cookieData);
  // Extract the signature from the cookie data.
  $signature = substr($cookieData, 0, 64);
  // Extract the data without the signature.
  $encryptedData = substr($cookieData, 64);
  // The signature must match for verification of the data.
  if ($signature !== hash_hmac('sha256', $encryptedData, $key)) {
    return FALSE;
  }
  // Extract the initialization vector prepended during encryption.
  $iv = substr($encryptedData, 0, 16);
  // Extract the main encrypted string which contains the profile details.
  $encrypted = substr($encryptedData, 16);
  // Decrypt the data using the key and
  // initialization vector extracted above.
  return openssl_decrypt($encrypted, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);
}
  1. We generate the same key using the same string parameter given during encryption.
  2. Then we reverse the base64 encoding, as we need to extract the signature to verify it.
  3. We generate the same signature again, since we have the same key that was used to create the signature during encryption. If the signatures don't match, validation fails!
  4. Otherwise, we extract the initialization vector from the encrypted data and use it to decrypt the data, which is then returned to be utilized.
/**
 * Validates cookie.
 *
 * @param string $cookieData
 *   Encrypted cookie data.
 *
 * @return bool
 *   TRUE or FALSE based on cookie validation.
 */
public static function validateCookie($cookieData) {
  if (self::decryptCookie($cookieData)) {
    return TRUE;
  }
  return FALSE;
}

We can verify the cookie on requests made to the website to maintain our session. You can implement a function that expires the cookie to simulate user logout. We can also use the decrypted user data from the cookie to serve user-related pages.

navneet.singh Sat, 09/30/2017 - 13:45

Bay Area Drupal Camp: Training Registration for BADCamp 2017 is Open!

Fri, 09/29/2017 - 20:31
Grace Lovelace Fri, 09/29/2017 - 11:31am

Training Signups are Now Open!

Are you prepared to master your Drupal skills? BADCamp has two full days of training offered by some of the most talented leaders in the Drupal community. Join the masters on Wednesday and Thursday while they unfold the magic. This year BADCamp offers skills training in DevOps, theming, module development, content strategy, and much more!

All courses will be all-day (approximately 8am-5pm) with a break for lunch. Sign up today -- openings go quickly, and classes will fill up fast.

 

Signup Today


BADCamp has historically provided a completely free training thanks to the overwhelming generosity of our sponsors. However, this year we must charge a nominal fee of $25 to cover operating expenses as we are short on sponsorship funding. We sincerely apologize for this short notice. We needed to find ways at the last minute to break even.

This was a really difficult decision for the BADCamp organizers to make.

If you can't afford the $25 or it is super complicated to get funding, please reach out to the BADCamp organizers via the contact form and we will help! We have had generous attendees offer to donate extra seats in the classes.

Thank you for your understanding.

BADCamp is 100% volunteer run and 100% funded by our sponsors and the support of our local community. Thank you!


Getting Started with Drupal - Wednesday

by Agaric & Digital Echidna with Mauricio Dinarte

This training is aimed at people just starting with Drupal. Basic concepts will be explained and then put into practice. The objective is that someone who might not even know about Drupal can understand the different concepts and building blocks needed to create a website using this CMS.

 

SEO & Accessibility - Wednesday

by Hook 42 with Aimee Degnan and Carie Fisher

SEO stands for "Search Engine Optimization." Improving your website's SEO can translate into more visitors, better conversions, and more sales.

Accessibility refers to the design of products, devices, services, or environments for people who experience disabilities.

When properly configured, Drupal is a very SEO-friendly and Accessible web framework. The trick is to know which Drupal modules you need to install and how to optimally configure them. Configuration doesn’t stop at the module level - a solid content strategy is required to make the most Accessible and optimized website. “Content is King” and our job is to make Drupal showcase content in the most effective way to all consumers and search engines.

 

Object Oriented PHP - Wednesday

by Chapter Three

With the move to Drupal 8, everyone who works in the PHP layer will be exposed more and more to object-oriented code. Come learn the basics of working with objects in PHP and how OOP can help you write well-structured code.

 

Continuous Integration: From 0 to CI Hero - Wednesday

by Tandem with Alec Reynolds and Mike Pirog

Continuous Integration (CI) methodologies and tools can deliver huge efficiency gains for web development teams. However, overburdened with feature requests and new projects, many development teams never have the time to learn and implement a CI workflow. Now is that time.

In this training, we provide hands-on instruction in how to setup a continuous integration workflow for your team using Github, TravisCI, and several popular hosting platforms (Pantheon and Platform.sh).

 

Drupal Crash Course for Non-Developers - Wednesday

by Promet Source with Margaret Plett

Are you responsible for project management, content, or vendor selection and preparing to work with Drupal? This one-day training delivers all of the tools you need to get started. Delivered by an Acquia Certified Drupal Developer, this training will answer the questions you didn’t even know to ask!

 

Component-based Development in Drupal - Wednesday

by Mediacurrent with Mario Hernandez

In this training we will put into practice one of the latest trends in development: components. Building a website using the component-based approach can dramatically improve collaboration among teams, code reusability, flexibility, and the long-term maintenance of your website. We will work on building a living styleguide which will become the single source of truth for markup, styles, and JavaScript behaviors.

 

Component-based Theming with Twig - Thursday

by Forum One with Chaz Chumley

Join Forum One as they walk through the theming variations that started with traditional theme-centric design and have quickly moved toward component-based design. Together you will master component-based theming with Twig as you identify patterns, define components, and utilize command-line tools such as Composer, NPM and Grunt to quickly create a PatternLab-managed theme. Learn how to work smarter by developing components that can easily be integrated into project after project without having to recreate them yourself.

 

Hands on Drupal 8 Module Development using DrupalConsole - Thursday

by WeKnow with Jesus Manuel Olivas and Omar Aguirre

This training will provide an introduction to the most important changes for developers in Drupal 8, allowing students to practice Drupal OOP while at the same time gaining solid knowledge of the process of building modules for Drupal 8.

 

Theming Drupal 8 - Thursday

by Drupalize.me with Joe Shindelar

Themes combine HTML, CSS, JavaScript, and Drupal in order to make beautiful websites. Creating truly unique themes requires knowing how to use the Twig template language to manipulate HTML, how to add CSS and JavaScript assets in a way that's compatible with Drupal's caching, all while maintaining the flexibility that Drupal is known for.

 

Content Strategy for Drupal 8 - Thursday

by Evolving Web with Suzanne Deracheva

Drupal is a powerful tool for managing structured content. Many Drupal projects revolve around producing, displaying and organizing content effectively. This course will walk you through the process of creating a content strategy for your next Drupal project, and planning out how that content will be structured in Drupal. Whether you're creating a brand new site or migrating to Drupal, you'll learn techniques that will help you build a solid content strategy and a successful Drupal website.

 

Intro to Backdrop CMS - Thursday

by Nate & Jen Lampton

Backdrop CMS is for the small to medium sized business, non-profits, educational institutions, and companies or organizations who are delivering comprehensive websites on a budget. This introductory training will cover the basics of creating and administering a website with Backdrop CMS.

 

Drupal 8 Configuration System Basics - Thursday

by DrupalEasy with Mike Anello

The Drupal 8 configuration system can provide great advantages to managing the configuration of a site, but it can also cause massive headaches if used improperly. This presentation will provide an overview of how the Drupal 8 configuration system works, best practices on basic workflows to utilize it effectively, and a small sampling of some of the contributed modules available to enhance it.

  YOU make BADCamp awesome!

Would you have been willing to pay for your ticket? If so, then you can give back to the camp by purchasing an individual sponsorship at the level most comfortable for you. As our thanks, we will be handing out some awesome BADCamp swag.

  We need your help!

Do you want a more meaningful BADCamp experience? BADCamp would not be possible without the overwhelming love and support from our community! Volunteer to help set up, tear down, take pictures, monitor rooms or so much more!  If you are local and can help us, please contact Anne at anne@badcamp.net or sign up on our Volunteer Form.

  Sponsors

A HUGE shout out of thanks to our sponsors who have helped make this magnificent event possible. Interested in sponsoring BADCamp? Contact matt@badcamp.net or anne@badcamp.net

Thank you to Pantheon & Acquia for sponsoring at the Core level to help keep BADCamp free and profoundly reverential.

 

Drupal Planet

Lullabot: Fundamentals of Responsive Images

Fri, 09/29/2017 - 18:40

As a recovering English major, I’d like to believe words alone are enough to tell a tale on the web. A text-only page is fast: even a long article can load nearly instantly. Add photos, and the web slows down. Yet great images bring emotion, a connection with others and the world around us. They’re often worth the tradeoff in time to load a page.

People don’t want to wait around longer than necessary, though. Any benefit you get from a great image vanishes once someone’s neck begins to tense up as the loading bar slowly creeps from one side of the URL bar to the other.


Images also lose their emotional impact if they’re blurry and someone has to squint to see the subject.

If you take an image that looks nice and crisp on a phone, then share that same file on a big desktop screen, it’s going to look fuzzy. Switch it around with a nice, big image that looks great on desktop, and somebody looking at the same file on a phone will grow impatient, waiting for the file to load.

We want the best of both worlds: images that look great no matter which screen they’re viewed on, while loading as quickly as possible.

Thankfully there’s a great solution to this problem due to the work of the Responsive Images Community Group. They worked with browser developers to develop new markup options such as the picture element and the sizes and srcset attributes. With these tools, we can provide a selection of image files so your browser can pick the right one depending on how someone is viewing it. This can help to make sure photos download as fast as possible while still being enjoyed in all their glory.

There are a lot of great resources that help explain the new responsive images specification. I highly recommend Jason Grigsby’s article series, Responsive Images 101, as well as Eric Portis’ Responsive Image Guide.  You can read the actual specifications for the picture element or the srcset and sizes attributes, although specs can be pretty dry reading. Understanding the specifications and syntax are important, but you still need to make a number of key decisions on how you’ll use responsive images with a particular site.

I’ve set up responsive images on a number of large sites like NYU Langone, and I also help to maintain the Responsive Image and Breakpoint modules for Drupal 8, so I wanted to share some of my experiences in working with responsive images.

In this article, I’ll be explaining some of the key concepts for responsive images, as well as providing an overview of a few different responsive image tactics. The solutions used for any particular image will vary. Understanding those options will help you to set out on the right path.

I’ll dig into more technical detail in future articles focused on some of those individual tactics. Right now, let’s start looking at the various ways we can make sure our images look awesome and load fast.

Picking the right method to make your images responsive

The biggest difference in how you’ll handle making images responsive is what you’ll do for images that are photos, versus how you’ll handle logos and icons.

Photos—often referred to as raster images—do not scale so easily. The word raster comes from the Latin word rastrum, or rake. Old cathode ray tubes created images on screens by literally drawing one line at a time, raking each across the screen. These days raster images are created by hundreds of thousands to millions of individual dots, or pixels, on a screen. It takes a lot of data to tell a browser what color each of those dots should be in order to recreate an image.


Logos and icons on the other hand often use vector graphics. These images can be specified using mathematical vectors—a series of points on lines along with information that describes the curves connecting those points. The simpler shapes and colors in vector graphics can scale really easily to a wide variety of sizes, because math can easily calculate the color needed for each pixel.

For vector graphics, you’ll want to use SVG files. SVG means Scalable Vector Graphics, and the name really says it all. SVGs are text files which use XML to describe the vectors necessary to create an image. Use an SVG plus a little CSS, and your logos and icons will be responsive.

I would not recommend using the older technique of loading icons through a webfont containing multiple icons. The goal of that was to avoid multiple requests to a server; with the advent of the HTTP/2 protocol, that’s not as much of an issue. Icon fonts also have major accessibility issues, since they use a specific letter of a font for each icon. For people using a screen reader, that’s not so awesome.

For photos and other raster images, the techniques you use might vary a bit, depending on if the images are loaded through CSS or HTML.

There are ways to make background images added to a site through CSS responsive, but unfortunately browser support can be a bit shaky.

Thankfully, images used as content within a site, which are loaded through the HTML markup for a page, have great options that we can use. These sorts of images are generally what people are referring to when you hear the term responsive images.

So we’ll be focusing mostly on raster images that appear as content on your site. Even there, however, there are a few important variations to keep in mind.

How do images vary across breakpoints?

When we’re talking about making images responsive, we mean that we want to provide some variation in how those images appear depending on how they’re viewed.

Sometimes we want an image to essentially look the same whether you’re on mobile or desktop. For example the image always appears as a square or a rectangle. It might only fill a sidebar column on desktop, while filling the full width of the screen on mobile. However, it retains the same aspect ratio—the relationship between the height and the width of the image. 

We call this use case viewport sizing, and typically this ends up being the most common way that images are made responsive.

For viewport sizing, we typically just need a good ol’ img element with two new attributes: sizes and srcset. We’ll get into how those new attributes work, but the short version is that sizes tells a browser how much space an image takes up in a site’s layout at various screen sizes, while srcset provides some image file options the browser can choose between.

<img src="small.jpg"
     srcset="large.jpg 1024w,
             medium.jpg 640w,
             small.jpg 320w"
     sizes="(min-width: 36em) 33.3vw,
            100vw"
     alt="A swirling nebula">

Sometimes we need images to change a bit more at various screen sizes. Maybe we need a square image on mobile but a rectangle on desktop. Sometimes we also need to change the cropping of an image across breakpoints, showing a close-up image on mobile, while using a wider shot on desktop. Changes to aspect ratio and cropping are often called art direction, and they require a more complicated solution.

For art direction, we’ll need to use the picture element, which serves as a wrapper around a series of source elements, along with an img element. Each source element has its own media attribute: the media query defines the viewport size range in which that source will be used to select the particular file displayed by the img element contained inside the picture element.

You can also provide a sizes and srcset attribute on a source element so the browser has a number of files it can choose between for a particular viewport range.

<picture>
  <source media="(min-width: 70em)"
          sizes="40vw"
          srcset="nebula-artsy-wide-2800.jpg 2800w,
                  nebula-artsy-wide-2240.jpg 2240w,
                  nebula-artsy-wide-1400.jpg 1400w,
                  nebula-artsy-wide-1120.jpg 1120w">
  <source media="(min-width: 35em)"
          sizes="36vw"
          srcset="nebula-artsy-square-1120.jpg 1120w,
                  nebula-artsy-square-900.jpg 900w,
                  nebula-artsy-square-560.jpg 560w,
                  nebula-artsy-square-450.jpg 450w">
  <img src="nebula-artsy-tight.jpg" alt="An artsy cat">
</picture>

Using the picture element is overkill when you’re just dealing with viewport sizing. Having separate source elements is great for art direction, though, because you can provide a set of files on one source element with a certain aspect ratio or cropping level, while using a different aspect ratio or cropping level on another source element.

There’s very good browser support for the picture element, as well as the sizes and srcset attributes. IE11 is the main browser that still needs a little help, and for that you will want to make sure you’re using the Picturefill polyfill. Doing so may change what you use as a fallback src on the img inside the picture element. See the example on the Picturefill site for details.

For either viewport sizing or art direction, you’ll likely want to use the sizes and srcset attributes, so let’s dig a little deeper into what purpose those attributes are serving. In short, it’s all about pixels versus percentages.

Responsive image grudge match: Pixels versus percentages

Responsive design typically specifies a site’s layout in percentages, while raster images like photos are defined in pixels. It’s a grudge match, and our job is to serve as referees.

In one corner, we have responsive web design, where percentages define layout. By using a percentage, we allow the browser to do the heavy lifting of figuring out for an element like a sidebar exactly how many pixels wide it should be for a particular screen size. This is great, since like vector images, a browser can easily calculate layout boxes through the power of mathematics.

In our other corner, we have photographic images. Photos are more difficult to resize, because we need to give a browser detailed instructions about every single pixel in the image. As a result, photos don’t flex so easily.

Image files are essentially information with instructions detailed enough to create a photo at a particular size. A browser can figure out how to make an image smaller than its file size would suggest, because it has enough information to do so. Making an image bigger is much trickier, because if a browser doesn’t have enough detailed instructions for a larger size, it has to start guessing. And inevitably, it will guess wrong at least some of the time, which leads to images looking blurry.

However, that doesn’t mean we can just give a browser so much information that it can draw an image at any potential size. A high-res image file is going to be way bigger than necessary for a much smaller, low-res screen. More information, bigger file size, longer download time.

Because pixels matter so much, we also have to keep in mind screen resolution. Some displays use a larger number of pixels in the same amount of physical space in order to create a more detailed image. For a low-res display, a sidebar that has a layout width of 500px will use 500 physical pixels in a screen to create that width. For a high-res “retina” screen, there may be 1000 physical pixels in that same space. That means we need a higher resolution image to account for that difference.
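As a back-of-the-envelope check, the pixel budget for a slot is just its CSS width times the device-pixel ratio. A tiny sketch (an illustrative helper, not a browser API):

```javascript
// Illustrative helper (not a browser API): how many image pixels a
// layout slot needs at a given device-pixel ratio (DPR).
function requiredImageWidth(layoutCssPx, dpr) {
  return layoutCssPx * dpr;
}

requiredImageWidth(500, 1); // 500 pixels on a low-res display
requiredImageWidth(500, 2); // 1000 pixels on a high-res "retina" display
```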

So, for a particular type of image, we want to figure out how many pixels of information the file needs for the amount of space it takes up, as a percentage of the site’s layout, at a certain screen size and resolution. It’s okay if the file has a few more pixels of info than necessary (though not too many more), but we definitely want to avoid having too few pixels of info, so we can avoid blurry images.

The sizes attribute helps us to tell the browser about the layout percentages for a particular image, while srcset provides information about the number of pixels in each image file. Why do we need to put this information into HTML markup, though?

Why browsers need a sizes attribute

We need sizes because of how browsers process a web page that is being loaded. Once a browser receives the HTML for a page from the server, it begins scanning the document to look for other resources it will need to load. It finds all the CSS and JS documents that need to be loaded, as well as any image files, then begins prioritizing how it will download those files.

CSS files are a top priority, because they provide so much critical information about how a page’s HTML should be styled, and because CSS files often contain links to other resources that also need to be downloaded, such as web fonts and background images. JS is also a big priority, as it can rearrange the DOM elements that will need to be styled based on CSS rules. What’s critical to understand is that browsers improve overall performance by starting to download images while the CSS and JS files are still being processed.

That’s been a big challenge for responsive images, because you can’t just use CSS and JS to select an image file with the right width for a particular image slot, as doing so would mean waiting until all of the CSS and JS has been processed to fully understand the final layout of a page and thus the width of an image slot.

We can solve this tricky problem by using a key part of the responsive images spec: the sizes attribute. This attribute on an img element (or on a source element within a picture element) tells the browser how large that element will be once layout rules are applied.

So, within our sizes attribute, we provide a set of widths and accompanying media conditions. A media condition is simpler than a media query and consists only of something like (min-width: 1000px). For our example, we could provide the following sizes attribute:

sizes="(min-width: 36em) 33.3vw, 100vw"

The first thing to note is that we’re providing the media condition for the largest possible viewport first, as the browser will pick the first option that matches. Next, note the units we’re using:

  • Using em for widths in media conditions is a good practice, because it provides extra flexibility for people who change the settings in their browser to use larger than normal default font sizes. The typical default font size is 16px for 1em, so 36em is the equivalent of 576px. So we’re saying when the browser has a minimum width of 576px, this image takes up 33.3vw space in the layout.
  • The vw unit stands for viewport width: 1vw is equal to 1% of the width of the viewport; 33.3vw is 33.3% of the viewport width. The vw unit is used instead of percentages to make clear that this is a percentage of the viewport, not the width of the containing element.

Finally, the comma indicates the next set of media conditions and layout data. You can have as many commas as necessary within a sizes attribute. Here we just have one, so we’re saying that for viewports smaller than 576px wide, this image takes up 100% of the viewport space.
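The browser’s handling of that sizes value can be sketched roughly like this. This is a simplified illustration, not the real browser algorithm; it assumes the default 16px font size (so 36em = 576px) and only handles min-width conditions:

```javascript
// Simplified sketch of resolving a sizes attribute: walk the
// comma-separated list and use the first media condition that
// matches; the last entry, with no condition, is the default.
function resolveSlotWidth(viewportPx, sizesList) {
  for (const { minWidthPx, vw } of sizesList) {
    if (minWidthPx === undefined || viewportPx >= minWidthPx) {
      return viewportPx * (vw / 100); // vw is a percentage of the viewport
    }
  }
}

// sizes="(min-width: 36em) 33.3vw, 100vw"
const sizesList = [{ minWidthPx: 576, vw: 33.3 }, { vw: 100 }];

resolveSlotWidth(1200, sizesList); // ≈ 400 CSS px (33.3vw of 1200px)
resolveSlotWidth(400, sizesList);  // 400 CSS px (100vw)
```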

Let the browser choose with srcset

The sizes attribute needs to be paired with a srcset attribute on the same element (either an img element or a source element). This attribute will contain a comma-separated list of the URLs of image files: after each URL there is a space, then a number signifying the width of the image followed by the letter w. For example:

srcset="image235.jpg 235w, image290.jpg 290w, image365.jpg 365w, image470.jpg 470w, image580.jpg 580w, image730.jpg 730w, image940.jpg 940w, image1160.jpg 1160w"
  • The browser will take a look at these image files and use the number with the w to calculate which image file will best fit within the amount of space we’ve defined in the sizes attribute.
  • The browser knows the viewport size, so it can pick the right media condition and then use the width value next to the media condition to calculate how many pixels are needed to fill that space.
  • The browser also knows the resolution density of the screen, so it can take that into account with its calculations as well.
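The calculation described in the points above can be sketched as follows. Real browser heuristics are more involved and implementation-defined; this sketch just picks the smallest candidate with enough pixels to fill the slot at the screen’s device-pixel ratio:

```javascript
// Rough sketch of density-aware srcset selection: pick the smallest
// candidate whose width covers the slot at the device-pixel ratio.
// Real browsers weigh other factors too; this is only an illustration.
function pickSource(candidates, slotCssPx, dpr) {
  const neededPx = slotCssPx * dpr;
  const sorted = [...candidates].sort((a, b) => a.w - b.w);
  // First file with enough pixels, else fall back to the largest one.
  return (sorted.find(c => c.w >= neededPx) || sorted[sorted.length - 1]).url;
}

// A subset of the srcset example above:
const candidates = [
  { url: 'image235.jpg', w: 235 },
  { url: 'image470.jpg', w: 470 },
  { url: 'image940.jpg', w: 940 },
];

pickSource(candidates, 400, 1); // 'image470.jpg'
pickSource(candidates, 400, 2); // 'image940.jpg' (needs at least 800px)
```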

The browser can also, in theory, take into account the bandwidth you have available, perhaps providing a lower-res source if you’re on a slow 3G connection.

You don’t need to worry about that, though. All you need to do is provide the layout information in the sizes attribute and the possible image sources that can fit within that space, and the browser will take care of matching up the right source file with the image slot.

Make images fluid with CSS

To make images responsive, we still need to write CSS rules that will make the image flexible. For example:

img { width: 100%; height: auto; }

If you’ve provided a set of images in srcset with sufficient widths for the amount of space defined in sizes, this should be all you need. If you don’t have a way to guarantee that, you could also add a max-width: 100%; rule that ensures images are never made larger in the layout than the number of pixels within the image. This can cause design discrepancies if your images are supposed to take up a certain amount of space in a grid design, so I tend to leave off this rule.

If I want to have an image take up a certain amount of space within its container, I find it works better to put a wrapper div around the img or picture element and then set layout rules on that wrapper. That way I can have one consistent CSS rule for all images, but then modify the width of the wrapper in the situations that need that.

Next steps

Hopefully you now have a better understanding of a few different types of images and how you might make each responsive.

The sizes and srcset attributes are often key to making images responsive. In an upcoming article, I’ll talk through how to look at a particular type of image for a site, and then determine what values to use for sizes and srcset. Creating that sort of plan is really key to a successful responsive images solution.

Once you have a plan, you still need to create all the image file variations, and if at all possible you should find a way to avoid creating those image files manually. In a separate article I’ll go over how to use Drupal 8’s built-in tools to automate this process. In a decoupled site, you may find a cloud-based tool works well for that part of the process: Kris Bulman will be going over how to do that in a future article.

Other articles in this series may go over topics like how to implement art direction for responsive images in Drupal 8.

The payoff for this effort is that your images look nice and crisp at all viewport sizes, while still downloading efficiently. No more super slow mobile sites that take forever to load images! That makes a huge difference for those visiting your site. Downloading image files tends to be a big chunk of the work that browsers do when visiting a new site. Speed that up and everybody wins.

Amazee Labs: The final day of DrupalCon Europe talks

Fri, 09/29/2017 - 18:09

#DrupalConEur is 3 days of talks surrounded by a day of summits and a day of collaboration sprints. Thursday was the 3rd and final day of presentations.

John Albin

Most importantly for me, Thursday was the day after I finished giving my talk, so I was able to stop tweaking my slides and focus on learning. I started my day by grabbing a hazelnut croissant and coffee in the underground and headed to the community keynote by Joe Shindelar, “Everyone Has Something to Share”.

After that I went to Everett Zufelt’s ”JavaScript and Accessibility: Don't Blame the Language”. Everett busted several myths about accessibility, including the pointed “Our web application is accessible (but we’ve never tested it)”. The most useful part of his talk was describing ways that websites get accessibility right. I've now added “ARIA Live Regions” to my TO DO list and highly recommend that anyone making websites watch the video of his presentation.

While I was grabbing a quick lunch, Tamás Hajas presented, as part of the Frontend Track, “What’s new in CSS? Introduction to CSS Grid and CSS Custom Properties”. I added the video to my YouTube “to watch” list and headed to Chris Ruppel’s “Houdini, the Future of CSS”. Chris’s talk was part of the Horizons track, which focuses on the future of Drupal and the web. Houdini is a proposed API that will go into web browsers and give CSS developers the same ability that JavaScript developers already have: the capability to polyfill proposed changes to the CSS spec. With Houdini, CSS developers could potentially create new syntax (like nested selectors or element queries) and use that in their production code.

Earlier in the week, the Drupal Association announced there would be no DrupalCon Europe in 2018 and that they had formed a committee to determine if and/or how a DrupalCon Europe 2019 could happen. So with this in mind, Théodore Biadala, whose session was scheduled in the final time slot of the day, started his ”Offline Core” presentation by saying ”Thank you for coming to the last session of the last day of the last DrupalCon Europe. Ever.” Awwwww… (Hopefully it won’t be, but that’s another blog post) The “Offline Core” session was a part of the Core Conversations track and after a short presentation about his idea for supporting Progressive Web Apps (PWA) and Service Workers in core, Théodore facilitated a lively discussion with the session attendees on multiple facets of the idea. We even solved a potentially tricky problem: how to turn off a Service Worker (code running on a user's browser) after the website owner has disabled the Service Worker's module in Drupal.

For the past six years, the closing session has been followed by Drupal Trivia Night! The Drupal-related trivia questions are written by the wonderful Drupal Ireland community. I attended the first trivia night at DrupalCon Chicago 2011 and never miss it when I go to DrupalCon. Tonight was the 15th Trivia Night. I know because this was one of the trivia questions (Dang it! I wrote down "16" as my answer.)

Since I’ve been in the Drupal Community for 13 years, I know a lot of trivia, but as usual, I did horribly. Winning Trivia Night is not the goal, though; having fun is, and the Drupal Ireland team does a great job of getting everyone involved and happy while losing badly. For example, my team won an award for "favorite team name"; the team name we picked was "We love the Irish!"

Valuebound: How to create custom token to be used in default mail message template in Drupal 8

Fri, 09/29/2017 - 16:49

Sometimes we need to do similar coding in different places, such as for account settings email templates (the welcome email template, the forgot-password email template) from the UI, to get the same results. In this scenario, it's always suggested to create a custom token and use it in the different email templates (account settings email templates) from the Drupal UI in [token:type] format.

In our previous blog, we explored creating custom tokens in Drupal 7; in this one, we will take a brief look at what a token is and how to create one in Drupal 8.

First things first: what is a token?

Tokens are very small bits of data or text that we use or place into…