Drupal.org - aggregated feeds in category Planet Drupal

Lullabot: Incredible Decoupled Performance with Subrequests

October 12, 2017 - 15:52

In my previous post, Modern Decoupling is More Performant, we discussed how saving HTTP round-trips has a very positive impact on performance. In particular, we demonstrated how the JSON API module could help your application by returning multiple entities in a single request. Doing so eliminates the need for making an individual request per entity. However, this is only possible when fetching entities, not when writing data, and only if those entities are related to the entry point (a particular entity or collection).

Sometimes you can solve this problem by writing a custom resource in the back-end for each case, but that can lead to many custom resources, which hurts maintainability and is tiresome. If your API is public and you don’t have prior knowledge of what the consumers are going to do with it, it’s not even possible to write these custom endpoints.

The Subrequests module completes that idea by allowing ANY set of requests to be aggregated together. It can aggregate them even when one of them depends on a previous response. The module works with any request; it is not limited to REST or any other constraint. For simplicity, all the examples here will make requests to JSON API.

Why Do We Need It?

The main concept of the Subrequests module is that instead of sending multiple requests to your Drupal instance, we send only a single request. In this master request, we provide the information about the requests we need to make in a JSON document. We call this document a blueprint.

A blueprint is a JSON document containing the instructions for Drupal to make all those requests in our name. The blueprint document contains a list of subrequest objects. Each subrequest object contains the information about a single request being aggregated in the blueprint.

Imagine that our consumer application has a decoupled editorial interface. This editorial interface contains a form to create an article. As part of the editorial experience, we want the form to create the article and a set of tags in the Drupal back-end.

Without using Subrequests, the consumer application would need to execute the following requests when the form is submitted:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user, based on the username present in the editorial app.
  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.
  • Create the article in the form using the user UUID and the newly created tags.

We can query for the user and the vocabulary in parallel. Once that is done, and using the information in the vocabulary response, we can create the tag entities. Once those are created, we can finally create the article. In total, we would be making five requests at three sequential levels. And, this is not even a complex example!


JavaScript pseudo-code for the form submission handler could look like this:

console.log('Article creation started…');
Promise.all([
  httpRequest('GET', 'https://cms.contentacms.io/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags'),
  httpRequest('GET', 'https://cms.contentacms.io/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin'),
])
  .then(res => {
    const [vocab, user] = res;
    return Promise.all([
      Promise.resolve(user),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag1, headers),
      httpRequest('POST', 'https://cms.contentacms.io/api/tags', bodyForTag2, headers),
    ]);
  })
  .then(res => {
    const [user, tag1, tag2] = res;
    const body = buildBodyForArticle(formData, user, tag1, tag2);
    return httpRequest('POST', 'https://cms.contentacms.io/api/articles', body, headers);
  })
  .then(() => {
    console.log('Article creation finished!');
  });

Using Subrequests

Our goal is to have JavaScript pseudo-code that looks like:

console.log('Article creation started…');
const blueprint = buildBlueprint(formData);
httpRequest('POST', 'https://cms.contentacms.io/api/subrequests?_format=json', blueprint, headers)
  .then(() => {
    console.log('Article creation finished!');
  });

We've reduced our application code to a single POST request that contains a blueprint in the request body. We have reduced the problem to the blueprint creation. That is a big improvement in the developer experience of consumer applications.
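To make this concrete, here is a minimal, hypothetical sketch of the buildBlueprint() helper referenced in the pseudo-code above (it is not taken from the module's documentation); it simply assembles the subrequest objects that the following sections describe:

// Minimal sketch of a hypothetical buildBlueprint() helper. It returns an
// array of subrequest objects; the exact objects (and the replacement tokens
// and waitFor keys they use) are explained in the sections below.
function buildBlueprint(formData) {
  return [
    {
      requestId: 'vocabulary',
      action: 'view',
      uri: '/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags',
      headers: { Accept: 'application/vnd.api+json' },
    },
    {
      requestId: 'user',
      action: 'view',
      uri: '/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin',
      headers: { Accept: 'application/vnd.api+json' },
    },
    // The subrequests that create the tags (from formData) and the article
    // are appended here, as developed in the rest of this post.
  ];
}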

Parallel Requests

In our current task we need to perform two initial HTTP requests that can be run in parallel:

  • Query Drupal to find the UUID for the tags vocabulary.
  • Query Drupal to find the UUID of the user based on the username in the editorial app.

That translates to the following blueprint:

[ { "requestId": "vocabulary", "action": "view", "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags", "headers": ["Accept": "application/vnd.application+json"] }, { "requestId": "user", "action": "view", "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin", "headers": ["Accept": "application/vnd.application+json"] } ]

For each subrequest, we can observe that we are providing four keys:

  • requestId: A string used to identify the subrequest. This is an arbitrary value generated by the consumer application.
  • action: Identifies the action being performed. A "view" action will generate a GET request, a "create" action will generate a POST request, etc.
  • uri: The URL where the subrequest will be sent.
  • headers: An object containing the headers specific to this subrequest.

The response to this blueprint (after adjusting the permissions in Drupal to view users and vocabularies) contains the responses to both subrequests:

{ "vocabulary": { "headers": { "content-id": ["<vocabulary>"], "status": [200] }, "body": "{\"data\":[{\"type\":\"vocabularies\",\"id\":\"47ce8895-0df6-44a4-af43-9ef3b2a924dd\",\"attributes\":{\"status\":true,\"dependencies\":{\"module\":[\"recipes_magazin\"]},\"_core\":\"HJlsFfKP4PFHK1ub6QCSNFmzAnGiBG7tnx53eLK1lnE\",\"name\":\"Tags\",\"vid\":\"tags\",\"description\":\"Use tags to group articles on similar topics into categories.\",\"hierarchy\":0,\"weight\":0},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies\\/47ce8895-0df6-44a4-af43-9ef3b2a924dd\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/vocabularies?filter%5Bvid-filter%5D%5Bcondition%5D%5Bpath%5D=vid\\u0026filter%5Bvid-filter%5D%5Bcondition%5D%5Bvalue%5D=tags\"}}" }, "user": { "headers": { "content-id": ["<user>"], "status": [200] }, "body": "{\"data\":[{\"type\":\"users\",\"id\":\"a0b7af80-e319-4271-899f-f151d3fbfc8e\",\"attributes\":{\"internalId\":1,\"name\":\"admin\",\"mail\":\"admin@example.com\",\"timezone\":\"Europe\\/Madrid\",\"isActive\":true,\"createdAt\":\"2017-09-15T15:47:26+0200\",\"updatedAt\":\"2017-09-15T20:06:15+0200\",\"access\":1505565434,\"lastLogin\":\"2017-09-15T20:06:07+0200\"},\"relationships\":{\"roles\":{\"data\":[]}},\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users\\/a0b7af80-e319-4271-899f-f151d3fbfc8e\"}}],\"links\":{\"self\":\"http:\\/\\/localhost\\/api\\/users?filter%5Badmin%5D%5Bcondition%5D%5Bpath%5D=name\\u0026filter%5Badmin%5D%5Bcondition%5D%5Bvalue%5D=admin\"}}" } }

In the (simplified) response above we can see that for each subrequest, we have one key in the response object. That key is the same as our requestId in the blueprint. Each one of the subresponses contains the information about the response headers and the response body. Note how the response body is an escaped JSON object.
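Because each subresponse body arrives as an escaped JSON string, the consumer has to decode it before using it. Here is a minimal sketch of how that might look with the fetch API, assuming the /api/subrequests endpoint and the blueprint shown earlier:

// Minimal sketch: send the blueprint and decode each subresponse body.
// Assumes a fetch()-capable environment and the blueprint built earlier.
fetch('https://cms.contentacms.io/api/subrequests?_format=json', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(blueprint),
})
  .then(response => response.json())
  .then(subresponses => {
    // Each key matches a requestId from the blueprint; each body is an
    // escaped JSON string that must be parsed separately.
    const vocabulary = JSON.parse(subresponses.vocabulary.body);
    const user = JSON.parse(subresponses.user.body);
    console.log(vocabulary.data[0].id, user.data[0].id);
  });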

This blueprint is not sufficient to create an article with two tags, but it's a great start. Let's build on top of that to create the tags and the article.

Dependent Requests

The next task we need to execute is the creation of the two tag entities:

  • Create the first tag in the form using the vocabulary UUID.
  • Create the second tag in the form using the vocabulary UUID.

To do this, we will need to expand the blueprint. However, we don't know the vocabulary UUID at the time we are writing the blueprint. What we do know is that the vocabulary UUID will be in the subresponse to the vocabulary subrequest. In particular, we can find the UUID in data[0].id.

We will use that information to create a blueprint that can create tags. Since we don't know the actual value of the vocabulary UUID, we will use a replacement token. At some point during the blueprint processing, Drupal will resolve the token to the actual UUID value.

Replacement Tokens

We can use replacement tokens anywhere in the body or the URI of our subrequests. For those to be resolved, a token needs to be formatted in the following way:

{{<requestId>.<"body"|"headers">@<json-path-expression>}}

In particular, the replacement token for our vocabulary UUID will be:

{{vocabulary.body@$.data[0].id}}

What this replacement says is:

  1. Use the subresponse for the vocabulary subrequest.
  2. Take the body from that subresponse.
  3. Extract the string under data[0].id by executing the JSON Path expression $.data[0].id. You can execute any JSON Path expression as long as it returns a string. JSON Path is a very powerful way to extract data from an arbitrary JSON object, in our case the body of the subresponse to the vocabulary subrequest (see the short JavaScript illustration after this list).
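As a rough illustration (this is not how the module implements it), resolving the token amounts to something like the following plain JavaScript, using the decoded vocabulary subresponse from the earlier sketch:

// Rough equivalent of resolving {{vocabulary.body@$.data[0].id}}.
const vocabularyBody = JSON.parse(subresponses.vocabulary.body);
const vocabularyUuid = vocabularyBody.data[0].id; // e.g. "47ce8895-0df6-44a4-af43-9ef3b2a924dd"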

This is what our blueprint looks like after adding the subrequests to create the tag entities. Note the presence of the replacement tokens:

[ { "requestId": "vocabulary", "action": "view", "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags", "headers": {"Accept": "application/vnd.api+json"} }, { "requestId": "user", "action": "view", "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin", "headers": {"Accept": "application/vnd.api+json"} }, { "action": "create", "requestId": "tags-1", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] }, { "action": "create", "requestId": "tags-2", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] } ]

Note that to use a replacement token in a subrequest, we need to add a dependency on the subresponse that contains the information. That's why we added the waitFor key in our tag subrequests.

Finishing the Blueprint

Using the same principles that we used for the tags, we can add the final subrequest:

  • Create the article in the form using the user UUID and the newly created tags.

That will leave our completed blueprint as:

[ { "requestId": "vocabulary", "action": "view", "uri": "/api/vocabularies?filter[vid-filter][condition][path]=vid&filter[vid-filter][condition][value]=tags", "headers": {"Accept": "application/vnd.api+json"} }, { "requestId": "user", "action": "view", "uri": "/api/users?filter[admin][condition][path]=name&filter[admin][condition][value]=admin", "headers": {"Accept": "application/vnd.api+json"} }, { "action": "create", "requestId": "tags-1", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My First Tag\"},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] }, { "action": "create", "requestId": "tags-2", "body": "{\"data\":{\"type\":\"tags\",\"attributes\":{\"name\":\"My Second Tag\",\"description\":null},\"relationships\":{\"vocabulary\":{\"data\":{\"type\":\"vocabularies\",\"id\":\"{{vocabulary.body@$.data[0].id}}\"}}}}}", "uri": "/api/tags", "headers": {"Content-Type": "application/vnd.api+json"}, "waitFor": ["vocabulary"] }, { "action": "create", "requestId": "article", "headers": {"Content-Type": "application/vnd.api+json"}, "body": "{\"data\":{\"type\":\"articles\",\"attributes\":{\"body\":\"Custom value\",\"default_langcode\":\"1\",\"langcode\":\"en\",\"promote\":\"1\",\"status\":\"1\",\"sticky\":\"0\",\"title\":\"Article Created via Subrequests!\"},\"relationships\":{\"tags\":{\"data\":[{\"id\":\"{{tags-1.body@$.data.id}}\",\"type\":\"tags\"},{\"id\":\"{{tags-2.body@$.data.id}}\",\"type\":\"tags\"}]},\"type\":{\"data\":{\"id\":\"article\",\"type\":\"contentTypes\"}},\"owner\":{\"data\":{\"id\":\"{{user.body@$.data[0].id}}\",\"type\":\"users\"}}}}}", "uri": "/api/articles", "waitFor": ["user", "tags-1", "tags-2"] } ] More Powerful Replacements

Imagine that instead of creating an article for a single user, we wanted to create an article for each one of the users on the site. We cannot write a simple blueprint, like the one above, since we don't know how many users there are in the Drupal site. Hence, we cannot write an article creation subrequest for each user.

To solve this problem, we can tweak the user subrequest so that instead of returning a single user it returns all the users on the site:

[ … { "requestId": "user", "action": "view", "uri": "/api/users", "headers": {"Accept": "application/vnd.api+json"} }, … ]

Then, in our replacement tokens, we can write a JSON Path expression that will return a list of user UUIDs instead of a single string. Subrequests will accept JSON Path expressions that return either a string or an array of strings for the replacement tokens.

In our article creation subrequest, we will need to change {{user.body@$.data[0].id}} to {{user.body@$.data[*].id}}. The Subrequests module will create a duplicate of the article subrequest for each replacement item. In our case, this results in one copy of the article creation subrequest for each user in the user subresponse.
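As a rough illustration of the array case (again plain JavaScript, not the module's internals), evaluating $.data[*].id against the users collection body simply yields one UUID per user, and Subrequests then duplicates the article subrequest once per value:

// Rough equivalent of evaluating $.data[*].id against the users body.
const usersBody = JSON.parse(subresponses.user.body);
const userUuids = usersBody.data.map(user => user.id);
// Subrequests creates one copy of the article subrequest per UUID in userUuids.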

The Final Response

The modified blueprint that generates one article per user will return one article subresponse for each user on the site.

We can see how a single subrequest can generate n subresponses, and we can use each one of those to generate n other subresponses, etc. This highlights how powerful this technique is. In addition, we have seen that we can combine different types of operations. In our example, we mixed GET and POST in a single blueprint (to get the vocabulary and create the new tags).

Conclusion

Subrequests is a great way to fetch or write many resources in a single HTTP request. It allows us to improve performance significantly while maintaining almost the same flexibility that custom code provides.

Further Your Understanding

If you want to know more about the blueprint format you can read the specification. The Subrequests module comes with a JSON schema that you can use to validate your blueprint. You can find the schema here.

The hero image was downloaded from Frankenphotos and used without modifications under a CC BY 3.0 license.

Mediacurrent: Webinar Recap: Security by Design - An Introduction to Drupal Security

October 12, 2017 - 15:29

With cybercrime on the rise, securing data in Drupal has become a hot topic for developers and project stakeholders alike.

In our latest webinar, we were joined by three Drupal security experts from Townsend Security, Lockr and Mediacurrent who shared their approach for building a secure groundwork to protect site data in Drupal.

Dries Buytaert: The evolution of Acquia's product strategy

October 11, 2017 - 22:26

Four months ago, I shared that Acquia was on the verge of a shift equivalent to the decision to launch Acquia Fields and Drupal Gardens in 2008. As we entered Acquia's second decade, we outlined a goal to move from content management to data-driven customer journeys. Today, Acquia announced two new products that support this mission: Acquia Journey and Acquia Digital Asset Manager (DAM).

Last year on my blog, I shared a video that demonstrated what is possible with cross-channel user experiences and Drupal. We showed a sample supermarket chain called Gourmet Market. Gourmet Market wants its customers to not only shop online using its website, but to also use Amazon Echo or push notifications to do business with them. The Gourmet Market prototype showed an omnichannel customer experience that is both online and offline, in store and at home, and across multiple digital touchpoints. The Gourmet Market demo video was real, but required manual development and lacked easy customization. Today, the launch of Acquia Journey and Acquia DAM makes building these kind of customer experiences a lot easier. It marks an important milestone in Acquia's history, as it will accelerate our transition from content management to data-driven customer journeys.

Introducing Acquia Journey

I've written a great deal about the Big Reverse of the Web, which describes the transition from "pull-based" delivery of the web, meaning we visit websites, to a "push-based" delivery, meaning the web comes to us. The Big Reverse forces a major re-architecture of the web to bring the right information, to the right person, at the right time, in the right context.

The Big Reverse also ushers in the shift from B2C to B2One, where organizations develop a one-to-one relationship with their customers, and contextual and personalized interactions are the norm. In the future, every organization will have to rethink how it interacts with customers.

Successfully delivering a B2One experience requires an understanding of your user's journey and matching the right information or service to the user's context. This alone is no easy feat, and many marketers and other digital experience builders often get frustrated with the challenge of rebuilding customer experiences. For example, although organizations can create brilliant campaigns and high-value content, it's difficult to effectively disseminate marketing efforts across multiple channels. When channels, data and marketing software act in different silos, it's nearly impossible to build a seamless customer experience. The inability to connect customer profiles and journey maps with various marketing tools can result in unsatisfied customers, failed conversion rates, and unrealized growth.

Acquia Journey delivers on this challenge by enabling marketers to build data-driven customer journeys. It allows marketers to easily map, assemble, orchestrate and manage customer experiences like the one we showed in our Gourmet Market prototype.

It's somewhat difficult to explain Acquia Journey in words — probably similar to trying to explain what a content management system does to someone who has never used one before. Acquia Journey provides a single interface to define and evaluate customer journeys across multiple interaction points. It combines a flowchart-style journey mapping tool with unified customer profiles and an automated decision engine. Rules-based triggers and logic select and deliver the best-next action for engaging customers.

One of the strengths of Acquia Journey is that it integrates many different technologies, from marketing and advertising technologies to CRM tools and commerce platforms. This makes it possible to quickly assemble powerful and complex customer journeys.

Acquia Journey will simplify how organizations deliver the "best next experience" for the customer. Providing users with the experience they not only want, but expect, will increase conversion rates, grow brand awareness, and accelerate revenue. The ability for organizations to build more relevant user experiences not only aligns with our customers' needs but will enable them to make the biggest impact possible for their customers.

Acquia's evolving product offering also puts control of user data and experience back in the hands of the organization, instead of walled gardens. This is a step toward uniting the Open Web.

Introducing Acquia Digital Asset Manager (DAM)

Digital asset management systems have been around for a long time, and were originally hosted through on-premise servers. Today, most organizations have abandoned on-premise or do-it-yourself DAM solutions. After listening to our customers, it became clear that large organizations are seeking a digital asset management solution that centralizes control of creative assets for the entire company.

Many organizations lack a single source of truth when it comes to managing digital assets. This challenge has been amplified as the number of assets has rapidly increased in a world with more devices, more channels, more campaigns, and more personalized and contextualized experiences. Acquia DAM provides a centralized repository for managing all rich media assets, including photos, videos, PDFs, and other corporate documents. Creative and marketing teams can upload and manage files in Acquia DAM, which can then be shared across the organization. Graphic designers, marketers and web managers all have a hand in translating creative concepts into experiences for their customers. With Acquia DAM, every team can rely on one dedicated application to gather requirements, share drafts, consolidate feedback and collect approvals for high-value marketing assets.

On top of Drupal's asset and media management capabilities, Acquia DAM provides various specialized functionality, such as automatic transcoding of assets upon download, image and video mark-up during approval workflows, and automated tagging for images using machine learning and image recognition.

By using a drag-and-drop interface on Acquia DAM, employees can easily publish approved assets in addition to searching the repository for what they need.

Acquia DAM seamlessly integrates with both Drupal 7 and Drupal 8 (using Drupal's "media entities"). In addition to Drupal, Acquia DAM is built to integrate with the entirety of the Acquia Platform. This includes Acquia Lift and Acquia Journey, which means that any asset managed in the Acquia DAM repository can be utilized to create personalized experiences across multiple Drupal sites. Additionally, through a REST API, Acquia DAM can also be integrated with other marketing technologies. For example, Acquia DAM supports designers with a plug-in to Adobe Creative Cloud, which integrates with Photoshop, InDesign and Illustrator.

Acquia's roadmap to data-driven customer journeys

Throughout Acquia's first decade, we've been primarily focused on providing our customers with the tools and services necessary to scale and succeed with content management. We've been very successful in helping our customers scale and manage Drupal and cloud solutions. Drupal will remain a critical component to our customers' success, and we will continue to honor our history as committed supporters of open source, in addition to investing in Drupal's future.

However, many of our customers need more than content management to be digital winners. The ability to orchestrate customer experiences using content, user data, decisioning systems, analytics and more will be essential to an organization's success in the future. Acquia Journey and Acquia DAM will remove the complexity from how organizations build modern digital experiences and customer journeys. We believe that expanding our platform will be good not only for Acquia, but for our partners, the Drupal community, and our customers.

mark.ie: Drupal Camp Dublin is Next Week - Last Chance for Tickets

October 11, 2017 - 20:44

It seems like only yesterday that we held DrupalCon in Dublin; now we're back with our annual Drupal Camp Dublin.


This year's Drupal Camp Dublin has a great line up of speakers from Ireland and abroad, covering such topics as:

  • Building multi-lingual, multi-region websites (Stella Power)
  • Working as a developer with attention-deficit disorder (ADD) (Levi Govaerts)
  • Planning for disruptions (Jochen Lillich)
  • Migrating from Drupal 4 to 5 to 6 to 7 to 8 (Alan Burke)
  • Automating deployments (Luis Rodriguez)
  • Working with Webform, Commerce, Paragraphs, Display Suite, and more (Chandeep Khosa)
  • Live debugging a site that's giving issues (Anthony Lindsay)
  • Deploy with Fabric, and test-driven development (Oliver Davies)
  • Design in the Browser (yours truly, me, Mark Conroy)
  • Teaching web development at third level (Ruairi O'Reilly)
  • The QA process (Daniel Shaw)
  • Getting started with Docker (Ed Crompton)
  • The new theme coming to Drupal core (Mark Conroy)

And then there's some socials, and our Drupal Ireland AGM, and at least one other talk not announced yet, and ... you get the idea.

The full schedule is available on our website. There are some tickets left (only €20), so get them before they're all gone.

myDropWizard.com: Drupal 6 version of netFORUM Authentication not affected by SA-CONTRIB-2017-077

October 11, 2017 - 20:37

Today, there was a Moderately Critical security advisory for an Access Bypass vulnerability in the netFORUM Authentication module for Drupal 7:

netFORUM Authentication - Moderately critical - Access Bypass - SA-CONTRIB-2017-077

The module was bypassing the protections on the Drupal 7 user login form that deter brute-force attempts to log in to the site, and so it constituted an Access Bypass vulnerability: using this module made login less secure.

However, Drupal 6 (including Pressflow 6) doesn't have these same protections for the user login form, and so using this module is no less secure than using vanilla Drupal 6. Of course, these protections could be added to this module, and while this would be great security hardening, this doesn't represent a vulnerability - only a weakness which is also present (and widely known) in Drupal 6 core.

If you'd like all your Drupal 6 modules to receive security updates and have the fixes deployed the same day they're released, please check out our D6LTS plans.

Note: if you use the myDropWizard module (totally free!), you'll be alerted to these and any future security updates, and will be able to use drush to install them (even though they won't necessarily have a release on Drupal.org).

Mediacurrent: Mediacurrent Wins Three Nominations at 2017 Acquia Engage Conference

October 11, 2017 - 19:21

Mediacurrent has been selected as a finalist for the 2017 Acquia Engage Awards in the categories of Financial Services, Travel and Tourism, and Digital Experience. These awards recognize the amazing sites and digital experiences that leading digital agencies are building with the Acquia Platform.

Drupal blog: Drupal looking to adopt React

October 11, 2017 - 19:05

This blog has been re-posted with permission from Dries Buytaert's blog. Please leave your comments on the original post.

Last week at DrupalCon Vienna, I proposed adding a modern JavaScript framework to Drupal core. After the keynote, I met with core committers, framework managers, JavaScript subsystem maintainers, and JavaScript experts in the Drupal community to discuss next steps. In this blog post, I look back on how things have evolved, since the last time we explored adding a new JavaScript framework to Drupal core two years ago, and what we believe are the next steps after DrupalCon Vienna.

As a group, we agreed that we had learned a lot from watching the JavaScript community grow and change since our initial exploration. We agreed that today, React would be the most promising option given its expansive adoption by developers, its unopinionated and component-based nature, and its well-suitedness to building new Drupal interfaces in an incremental way. Today, I'm formally proposing that the Drupal community adopt React, after discussion and experimentation has taken place.

Two years ago, it was premature to pick a JavaScript framework

Three years ago, I developed several convictions related to "headless Drupal" or "decoupled Drupal". I believed that:

  1. More and more organizations wanted a headless Drupal so they can use a modern JavaScript framework to build application-like experiences.
  2. Drupal's authoring and site building experience could be improved by using a more modern JavaScript framework.
  3. JavaScript and Node.js were going to take the world by storm and that we would be smart to increase the amount of JavaScript expertise in our community.

(For the purposes of this blog post, I use the term "framework" to include both full MV* frameworks such as Angular, and also view-only libraries such as React combined piecemeal with additional libraries for managing routing, states, etc.)

By September 2015, I had built up enough conviction to write several long blog posts about these views (post 1, post 2, post 3). I felt we could accomplish all three things by adding a JavaScript framework to Drupal core. After careful analysis, I recommended that we consider React, Ember and Angular. My first choice was Ember, because I had concerns about a patent clause in Facebook's open-source license (since removed) and because Angular 2 was not yet in a stable release.

At the time, the Drupal community didn't like the idea of picking a JavaScript framework. The overwhelming reactions were these: it's too early to tell which JavaScript framework is going to win, the risk of picking the wrong JavaScript framework is too big, picking a single framework would cause us to lose users that favor other frameworks, etc. In addition, there were a lot of different preferences for a wide variety of JavaScript frameworks. While I'd have preferred to make a bold move, the community's concerns were valid.

Focusing on Drupal's web services instead

By May of 2016, after listening to the community, I changed my approach; instead of adding a specific JavaScript framework to Drupal, I decided we should double down on improving Drupal's web service APIs. Instead of being opinionated about what JavaScript framework to use, we would allow people to use their JavaScript framework of choice.

I did a deep dive on the state of Drupal's web services in early 2016 and helped define various next steps (post 1, post 2, post 3). I asked a few of the OCTO team members to focus on improving Drupal 8's web services APIs; funded improvements to Drupal core's REST API, as well as JSON API, GraphQL and OpenAPI; supported the creation of Waterwheel projects to help bootstrap an ecosystem of JavaScript front-end integrations; and most recently supported the development of Reservoir, a Drupal distribution for headless Drupal. There is also a lot of innovation coming from the community with lots of work on the Contenta distribution, JSON API, GraphQL, and more.

The end result? Drupal's web service APIs have progressed significantly the past year. Ed Faulkner of Ember told us: "I'm impressed by how fast Drupal made lots of progress with its REST API and the JSON API contrib module!". It's a good sign when a core maintainer of one of the leading JavaScript frameworks acknowledges Drupal's progress.

The current state of JavaScript in Drupal

Looking back, I'm glad we decided to focus first on improving Drupal's web services APIs; we discovered that there was a lot of work left to stabilize them. Cleanly integrating a JavaScript framework with Drupal would have been challenging 18 months ago. While there is still more work to be done, Drupal 8's available web service APIs have matured significantly.

Furthermore, by not committing to a specific framework, we are seeing Drupal developers explore a range of JavaScript frameworks and members of multiple JavaScript framework communities consuming Drupal's web services. I've seen Drupal 8 used as a content repository behind Angular, Ember, React, Vue, and other JavaScript frameworks. Very cool!

There is a lot to like about how Drupal's web service APIs matured and how we've seen Drupal integrated with a variety of different frameworks. But there is also no denying that not having a JavaScript framework in core came with certain tradeoffs:

  1. It created a barrier for significantly leveling up the Drupal community's JavaScript skills. In my opinion, we still lack sufficient JavaScript expertise among Drupal core contributors. While we do have JavaScript experts working hard to maintain and improve our existing JavaScript code, I would love to see more experts join that team.
  2. It made it harder to accelerate certain improvements to Drupal's authoring and site building experience.
  3. It made it harder to demonstrate how new best practices and certain JavaScript approaches could be leveraged and extended by core and contributed modules to create new Drupal features.

One trend we are now seeing is that traditional MV* frameworks are giving way to component libraries; most people seem to want a way to compose interfaces and interactions with reusable components (e.g. libraries like React, Vue, Polymer, and Glimmer) rather than use a framework with a heavy focus on MV* workflows (e.g. frameworks like Angular and Ember). This means that my original recommendation of Ember needs to be revisited.

Several years later, we still don't know what JavaScript framework will win, if any, and I'm willing to bet that waiting two more years won't give us any more clarity. JavaScript frameworks will continue to evolve and take new shapes. Picking a single one will always be difficult and to some degree "premature". That said, I see React having the most momentum today.

My recommendations at DrupalCon Vienna

Given that it's been almost two years since I last suggested adding a JavaScript framework to core, I decided to bring the topic back in my DrupalCon Vienna keynote presentation. Prior to my keynote, there had been some renewed excitement and momentum behind the idea. Two years later, here is what I recommended we should do next:

  • Invest more in Drupal's API-first initiative. In 2017, there is no denying that decoupled architectures and headless Drupal will be a big part of our future. We need to keep investing in Drupal's web service APIs. At a minimum, we should expand Drupal's web service APIs and standardize on JSON API. Separately, we need to examine how to give API consumers more access to and control over Drupal's capabilities.
  • Embrace all JavaScript frameworks for building Drupal-powered applications. We should give developers the flexibility to use their JavaScript framework of choice when building front-end applications on top of Drupal — so they can use the right tool for the job. The fact that you can front Drupal with Ember, Angular, Vue, React, and others is a great feature. We should also invest in expanding the Waterwheel ecosystem so we have SDKs and references for all these frameworks.
  • Pick a framework for Drupal's own administrative user interfaces. Drupal should pick a JavaScript framework for its own administrative interface. I'm not suggesting we abandon our stable base of PHP code; I'm just suggesting that we leverage JavaScript for the things that JavaScript is great at by moving relevant parts of our code from PHP to JavaScript. Specifically, Drupal's authoring and site building experience could benefit from user experience improvements. A JavaScript framework could make our content modeling, content listing, and configuration tools faster and more application-like by using instantaneous feedback rather than submitting form after form. Furthermore, using a decoupled administrative interface would allow us to dogfood our own web service APIs.
  • Let's start small by redesigning and rebuilding one or two features. Instead of rewriting the entirety of Drupal's administrative user interfaces, let's pick one or two features, and rewrite their UIs using a preselected JavaScript framework. This allows us to learn more about the pros and cons, allows us to dogfood some of our own APIs, and if we ultimately need to switch to another JavaScript framework or approach, it won't be very painful to rewrite or roll the changes back.
Selecting a JavaScript framework for Drupal's administrative UIs

In my keynote, I proposed a new strategic initiative to test and research how Drupal's administrative UX could be improved by using a JavaScript framework. The feedback was very positive.

As a first step, we have to choose which JavaScript framework will be used as part of the research. Following the keynote, we had several meetings at DrupalCon Vienna to discuss the proposed initiative with core committers, all of the JavaScript subsystem maintainers, as well as developers with real-world experience building decoupled applications using Drupal's APIs.

There was unanimous agreement that:

  1. Adding a JavaScript framework to Drupal core is a good idea.
  2. We want to have sufficient real-use experience to make a final decision prior to 8.6.0's development period (Q1 2018). To start, the Watchdog page would be the least intrusive interface to rebuild and would give us important insights before kicking off work on more complex interfaces.
  3. While a few people named alternative options, React was our preferred option, by far, due to its high degree of adoption, component-based and unopinionated nature, and its potential to make Drupal developers' skills more future-proof.
  4. This adoption should be carried out in a limited and incremental way so that the decision is easily reversible if better approaches come later on.

We created an issue on the Drupal core queue to discuss this more.

Conclusion

Drupal should support a variety of JavaScript libraries on the user-facing front end while relying on a single shared framework as a standard across Drupal administrative interfaces.

In short, I continue to believe that adopting more JavaScript is important for the future of Drupal. My original recommendation to include a modern JavaScript framework (or JavaScript libraries) for Drupal's administrative user interfaces still stands. I believe we should allow developers to use their JavaScript framework of choice to build front-end applications on top of Drupal and that we can start small with one or two administrative user interfaces.

After meeting with core maintainers, JavaScript subsystem maintainers, and framework managers at DrupalCon Vienna, I believe that React is the right direction to move for Drupal's administrative interfaces, but we encourage everyone in the community to discuss our recommendation. Doing so would allow us to make Drupal easier to use for site builders and content creators in an incremental and reversible way, keep Drupal developers' skills relevant in an increasingly JavaScript-driven world, and move us ahead with modern tools for building user interfaces.

Special thanks to Preston So for contributions to this blog post and to Matt Grill, Wim Leers, Jason Enter, Gábor Hojtsy, and Alex Bronstein for their feedback during the writing process.

Colorfield: The Hitchhiker's Guide to the Planet Drupal

October 10, 2017 - 22:24
In this newcomer guide, you will find:
  • How to accelerate the onboarding process and how to get a fresh Drupal 8 install, for testing.
  • The documentation reduced to the essentials for the following topics: tools, projects, Drupal concepts and drupalisms, main events, contribution and service providers.
  • A brief comparison of other solutions, and when to use Drupal.

There are at least 42 reasons to onboard drupalship.org!

Acro Media: Video: Integrating Payment Gateways in Drupal Commerce 2.x is a Snap!

October 10, 2017 - 22:00

 

 

To say that payment gateways are much improved in Commerce 2.x is a bit of an understatement. The process of implementing a payment gateway has been cut down to about a third of the time, with more functionality rather than less.

Elevated Third: 5 Ways Web Development Project Management Will Make Your Project More Successful

October 10, 2017 - 18:45

As account managers at Elevated Third, we manage many projects across our accounts. Web development project management is intangible though not unimportant. We do not create wireframes or write code, so our direct impact on the Drupal websites Elevated Third produces may be less clear to our clients.

During the sales process, some clients see their communication budget as an unnecessary expense. Similar to limiting overhead spending when choosing recipients for charitable organizations, limiting the communication budget means more time goes to execution, right?

Maybe not. In the same way that a successful benefit event can dramatically increase the funds available for a nonprofit’s mission, strong account management directly contributes to our clients achieving their business goals across projects.

So, how does an account manager foster a successful Drupal project at Elevated Third?

  1) Account managers are the single consistent knowledge holder throughout the life of a project

Our team’s level of involvement will vary throughout a project. While UX has a large impact in the beginning, developers complete the majority of the tasks at the end. The account manager is the only member of the team who is in every meeting from kickoff to launch. It can be frustrating (and often expensive) for clients when the team veers from their vision. As a consistent project knowledge holder, an account manager can guide the team to ensure that they are considering the big picture, even when the client is not in the room.

For Instance: A designer knows he needs to create visual design for the project. He reviews what he believes is the necessary documentation, but did not see the client’s email update describing her new brand direction. He spends hours designing with the original brand guidelines in mind, then presents it to the client. The client is then frustrated that her feedback was not implemented and additional hours will be needed to modify the design. As our contracts are time and materials, every additional hour spent on a project has a corresponding cost to our clients.

When an account manager is involved in a project, she is part of every conversation and reviews every client email. This means no feedback will get lost in translation and costly adjustments will be avoided. Account managers are not responsible for creating any element of the website, so we can focus on ensuring that our clients and end users are kept in mind in every meeting and throughout the whole project.

  2) Account managers keep budget and timeline top of mind

A core part of the account manager’s role is managing the client’s budget and timeline. No other member of the team has that responsibility. We balance designers and developers who, if given a chance, would often prefer to build the most beautiful, perfect user-friendly functionality. Their desire to build the best thing ever is valuable, but it has to be balanced with the client’s budget and timeline needs. The account manager sets deadlines and monitors burndown throughout the project. From early discussions of which features will be prioritized to consistent check-ins and tweaks throughout execution, account managers ensure that the project aligns with the established constraints.

For instance: A UX strategist, excited about how valuable the tool we are building will be for its users, starts planning her user testing. She creates a first round of prototypes and tests with five users. Their feedback is so beneficial, she creates another iteration of prototypes to test with another five users, and then tests a third. Although she has gained valuable insight, she has now used half of the project hours that were allocated for visual design, as the budget did not accommodate extensive user testing. When an account manager takes on the role of web development project management, she knows the scope and the hours that are allocated for each task. She completes a variety of checks and balances to ensure the execution aligns with the project constraints.

  3) Account managers communicate with clients and with the team

Custom web development can often be a mysterious and complex process. Luckily, an account manager has learned to translate jargon for our clients. As a result of working in this industry, we understand the terminology used along with the impact of the choices we are asking our clients to make. Not only do we coordinate meetings and send status updates to keep clients in the loop, but we are also uniquely equipped to ensure they understand the process. This means that our team can stay focused on their tasks and more efficiently complete work with minimal interruptions.

For Instance: A developer has spent an hour working on a very complex task. Knowing that he needs to maximize concentration and minimize interruptions, he silences all of his notifications. This practice, called going “heads down,” is common when tackling problem-solving tasks. During this time, a client reaches out with an extremely urgent issue. Since he is the only person available to answer her request, it lingers for hours before she receives a response. For some development-related issues, especially on a live site, this delay can dramatically impact the client’s bottom line. When an account manager is involved in the project, she can immediately alert the developer of the request and let the client know her concern is being addressed right away.

  4) Account managers are organization wizards

For all projects, but especially for complex projects, there can be a lot of documentation. Luckily, account managers choose this field because we love organizing chaos. This skill helps our team work faster throughout the course of a project. Although a client rarely sees our organization and management of tasks and documentation, they will see the benefits of more accurate work and increased efficiency across teams.

For Instance: A developer knows that she needs to reference a particular piece of documentation for the element of the site she is building today, but she can’t find it. She spends 15 minutes digging through folders to find what she needs, which seems to happen every time she completes a task. When an account manager is involved in a project, she knows what documentation the developer will need, so she has already attached it to the current task, saving the developer time.

 

5) Account managers are flexible and adapt their skills to maximize their value

Every other role on a project is clear. A UX strategist helps to define which features will best achieve the business goals and how to maximize a user’s experience of interacting with them. A designer crafts how they will appear. A developer builds them. An account manager’s role in web development project management is less clear. When people ask me what I do on a typical day, my answer often comes after a long pause, and it’s rarely the same. Many others in my field find it difficult to describe their role succinctly, as our work can vary dramatically from day to day and from project to project.

For instance: Some days, my role is quite technical, and I am preparing or reviewing project documentation or checking the quality of completed development tasks. Other days, my role is more interpersonal, and I am supporting my team in delivering their best work or in back-to-back meetings with my clients. With each project comes a new business to learn, often along with new technologies and additional nuance to my role. To be successful, I am always switching between the various priorities outlined here, along with many more.

 

At Elevated Third, we value our clients’ investment in our work and are always evolving to maximize the value of that investment. We build communication time into our projects because we know how invaluable strong account managers are to ensuring our Drupal websites generate the outcomes our clients value most.

Zivtech: Drupal is Not Just for Your Marketing Site

October 10, 2017 - 16:51

As a Drupal expert, many of the projects I’ve done over the years have been marketing websites. Drupal is widely understood as a content management system that’s used to power sites like ours, but this is actually only the tip of the iceberg of what Drupal can do for an organization. Our team has used Drupal to build a variety of complex custom web applications that help companies work more efficiently.

Do you need an intranet?

We’ve used Drupal to build intranets that securely keep internal content and documents for staff eyes only. Drupal has an abundance of community features that make it easy to have wikis, commenting, user profiles, and messaging. Many organizations we’ve worked with integrate their intranet with their LDAP or other Single Sign On system. 

Radial's intranet allows team members to quickly locate information about co-workers

We’ve also used Drupal for our own intranet for the past eight years. Our intranet helps keep our internal knowledge base easy to access and organizes information like our servers, sites, clients, and projects.


Drupal Modules: The One Percent: Drupal Modules: The One Percent —Content connected (video tutorial)

October 10, 2017 - 14:32
Episode 39

Here is where we seek to bring awareness to Drupal modules running on less than 1% of reporting sites. Today let's consider Content connected, a module which displays where content has been referenced.

ADCI Solutions: Web Designers methods and tools for enhancing a workflow

October 10, 2017 - 13:28

Designers do love order, so don't believe the stereotypes. Our Drupal team's designer created her own approach to organizing working files. It helps her communicate efficiently with the rest of the team - developers and managers.

 

Try out our approach and improve the workflow

 

 

Aten Design Group: Form and View Modes vs. Field Access in Drupal 8

October 10, 2017 - 9:13

Drupal 8 advertised many new, promising features after its release. One of the exciting new changes was the addition of form modes. Form modes promised to let you manage the content entry side of your site just as you often managed content display with view modes. This change seemed like it would eliminate the need for much of the custom and repetitive code I often needed to write inside a hook_form_alter.

Over time, I've realized that form modes aren't everything I had hoped they would be. While it's easy to create new form modes, it's literally impossible to use them without custom code or contributed modules. Drupal simply doesn't have a way to know when to use one form mode over another. Should it be based on role? Permissions? A field on the node? Content moderation state? There are contributed modules for most if not all of these, but nothing out-of-the-box.

This forced me to think about why I needed a form mode in the first place. Almost always, the answer was to disable or hide a field from a user because that user shouldn't be allowed to change that field. The same was also often true of my view modes (only to a lesser extent). I realized that this particular problem is not one of user experience, but of access control.

Drupal 8 has hook_entity_field_access(). This hook is called for every field for the specified entity type when the entity is viewed or when its form is shown. When you deny access to a field, either for viewing or editing, that field will not be shown to the user. In any scenario. This should be your preferred method for hiding fields that certain users should not be able to access.

Using field access over form and view modes to hide fields when a user should not be allowed to see or edit a field is the secure and "Drupal way" to do things. This prevents mistakes in configuration, which might accidentally leak field information via teasers, searches, and Views. It also future proofs your site. If you ever turn on REST or JSON API or add a new form or view mode down the line, you can never accidentally expose a field that needs to be kept private.

Best of all, using the field access hook is much easier to implement than all the hoops you'll have to jump through to get the right form modes displayed at the right times.

How to use hook_entity_field_access()

First, make a custom module in the standard way. Create a .module file and create the following function:

<?php

use Drupal\Core\Access\AccessResult;
use Drupal\Core\Field\FieldDefinitionInterface;
use Drupal\Core\Session\AccountInterface;
use Drupal\Core\Field\FieldItemListInterface;

/**
 * Implements hook_entity_field_access().
 */
function yourmodule_entity_field_access($operation, FieldDefinitionInterface $field_definition, AccountInterface $account, FieldItemListInterface $items = NULL) {
}
?>

From this hook, you should always return an AccessResult. By default, you should simply return a neutral access result. That is, your hook is not concerned with actually allowing or preventing access yet. Add the following to your function.

<?php
function yourmodule_entity_field_access($operation, FieldDefinitionInterface $field_definition, AccountInterface $account, FieldItemListInterface $items = NULL) {
  $result = AccessResult::neutral();
  if ($field_definition->getName() == 'field_we_care_about') {
    if (FALSE /* a condition we'll write later... */) {
      $result = AccessResult::forbidden();
    }
  }
  return $result;
}
?>

The above code will deny access when our still unwritten condition is true; in every other case, we're just saying "we don't care".

There's an infinite number of scenarios in which you might want to deny access, but let's say that we want to make a field only editable by an administrator. We would add the following:

<?php
function yourmodule_entity_field_access($operation, FieldDefinitionInterface $field_definition, AccountInterface $account, FieldItemListInterface $items = NULL) {
  $result = AccessResult::neutral();
  if ($field_definition->getName() == 'field_we_care_about') {
    if ($operation == 'update' && !in_array('administrator', $account->getRoles())) {
      $result = AccessResult::forbidden();
    }
  }
  return $result->addCacheContexts(['user.roles:administrator']);
}
?>

Now, for every user without the administrator role that attempts to update field_we_care_about, the field will not be accessible. This works for more than just forms. For example, if we had the REST module installed, this would block the user from updating the field in that way as well.

The last part to note is that we added a cache context to our AccessResult. This ensures that our access decision is only relevant when the current user does or does not have the 'administrator' role. It's important to understand that we added the cache context both when we did and when we did not deny access. If we had only added the context when denying access, and a user with the 'administrator' role happened to be the first person to attempt to access the field, then that result would be cached for all users no matter what.

Appnovation Technologies: Appnovator Spotlight: Janice Cheer

October 10, 2017 - 9:00
Meet Janice Cheer, Sales Enablement from Vancouver, BC.

1. Who are you? What's your story?

I’m a Chinese-Canadian who grew up in a small town called Squamish. After finishing high school, I moved out to Vancouver for school and obtained my Bachelor’s Degree in Business Administration with a minor in Marketing. Sin...

Hook 42: September A11Y (Accessibility) Talk Review

October 10, 2017 - 3:20

We have all heard about website accessibility and know what it means in a broad sense, but what does website accessibility look like in a practical sense?

This month’s A11Y Talk featured Scott O'Hara from The Paciello Group. In this A11Y Talk, Scott O'Hara addressed questions like:
- How do I get started in a11y?
- How do I get my team to care about it?
- Where does one start in trying to incorporate a11y into the work they or their team produce?
- Who is in charge of a11y at your company anyway?

Palantir: Drupal 8 is Great for Showing Solutions Quickly

October 9, 2017 - 18:45

The #D8isGr8 blog series will focus on why we love Drupal 8 and how it provides solutions for our clients. This post in the series comes from Luke Wertz, Solution Architect.


We often work on projects with clients who are juggling strict timelines and multiple stakeholders. From the time a vendor is selected, to contract signing and project kick-off meetings, it can sometimes be a whole month before our production team is able to really dig into a new project.

The thing I love about Drupal 8 is that it gives us the ability to skip parts of the prototyping phase and get into rapid proof-of-concept work. We can quickly demonstrate to our clients the problem space they’re working in and a potential solution. Drupal 8 allows us to get there without writing a lot of code, which means our client product owners are able to show progress to their stakeholders sooner.

This proof of concept work is enabled by the functionality that is now baked into Drupal 8 core. In previous versions of Drupal, Views was a contrib module. A lot of how Views functions in Drupal 8 is the same as before, but that extra step of having to install, deploy, and configure it has been removed.

The ability to show value to a client early and quickly is reflective of Palantir’s move to Agile development. We are a data-driven company, and we like to use quantitative methods to prove our value to our clients. Drupal 8 helps us to iterate rapidly: have an idea, quickly show how it might work, test it, and prove it.


aleksip.net: Reasons for choosing standards-based technologies

9. Oktober 2017 - 17:12
The recent announcement that Drupal is looking to adopt React has inspired me to live up to my Twitter bio, and be an active advocate for open standards-based technologies. While my knee-jerk reaction to the announcement was to focus on React, this blog post approaches the topic of adopting technologies in a more general manner, while still aiming to contribute to the current front-end framework discussion.

Valuebound: Step-by-step guide to Foundation framework to develop responsive web applications

9. Oktober 2017 - 14:55

Creating responsive websites has always been a challenge for many; even I faced similar difficulties in the beginning. Recently, our team came across a situation where we had to design a responsive and beautiful website in Drupal 8 for a media and publishing firm. To create such an amazing site, we came up with the idea of using the Foundation framework, and yes, it worked.

I have written this post to help anyone who has difficulty understanding the Foundation framework for developing responsive websites, a skill in rising market demand. My idea is that this article will be a "living laboratory" to help you understand Foundation from scratch. The post comprises an introduction to Foundation, its features, a comparison…

Amazee Labs: Tour de DrupAlps - My Amazee Extreme Challenge Recap

9. Oktober 2017 - 13:00

Recently, I took a month off to do the Amazee Extreme Challenge: after 3 years, each of us gets the opportunity to do something we would like to challenge ourselves with. In 31 days, I cycled from Switzerland over the Alps to DrupalCon Vienna. This post is intended to reflect on how my journey went and share some of the experiences I had while riding the DrupAlps tour.

Josef Dabernig

#DrupAlps Tour Summary:

Planning & preparations

In late 2016 I started brainstorming ideas for my extreme challenge. Initially, my plan was to hike the Alps from mountain hut to mountain hut. After considering the safety risk of hiking alone in the mountains for a month, I decided to go by bicycle, where at least I would be able to get help via paved roads if needed. As a passionate cyclist, climbing the Alps had been a dream of mine for a long time.

How did I plan out the route? Initially it was really hard to tell how much I would be able to cycle; I guessed that an average of 80 kilometers per day and around 1500 meters of elevation gain should be fine. Planning out the tour was really fun - basically, I tried to research the most beautiful and challenging mountain passes that you can cycle with a road bike. The quäldich site was a great resource for researching challenging mountain passes, and I used Strava to put together the route. Over the months of planning, and while riding, the route adapted flexibly. A map that compares the initial plan with the route I actually rode can be found here.

Apart from knowing where to go, I also needed to get in shape and equipped for the ride. In early 2017 I started cycling the Swiss mountain passes as soon as they opened, gaining more and more experience with the equipment and food I would need during the long days.

In terms of equipment, I decided to get a race bike (Rose XEON RS-4400), which is really lightweight but still made of aluminium, a material I considered more reliable than today’s popular carbon frames. Since I planned to carry all my luggage for an entire month and wanted to be able to cycle high-alpine mountain passes, I decided to get two compact bags. The Ortlieb Seat-Pack takes up to 16.5 liters and attaches behind the seat post. In addition, after quite some research I decided on the custom, tailor-fitted G219 Blade Frame Bag from Wanderlust Equipment. It took me a while to figure out the minimal set of clothes required to keep me dry, warm and adaptable. Weather conditions ranged between 35 and -5 degrees Celsius - sometimes it was sunny, cloudy, windy or just rainy - but in the end, the combination of bags turned out to work really nicely.

Since I had started working at 80% at the beginning of May, I was able to do weekend rides from Friday to Sunday to get used to the saddle. All the preparations were really helpful, but there was still a great deal of uncertainty, as I had never ridden more than three days in a row and was about to ride for an entire month.

The first week - Zürich to Italy and back to Switzerland

On August 25, my bike was finally packed and I was ready to get going. After a lovely breakfast with the Amazee team in the office, with some joining remotely via Zoom, it was time to say goodbye for a month and start the journey.

Well equipped with two Rapha shirts - one from Urs for the colder days and one from my girlfriend for the hot days - I was happy to start cycling. I was slowly getting into a daily routine of taking Instagram photos, navigating using Locus map and sharing the rides on my Strava profile.

The first weekend was already packed with highlights. I joined 2000+ cyclists for the minimal version of the Alpenbrevet. I did the bronze tour which covered two passes, while some of the most eager cyclists did 5 passes with a total elevation gain of 7000 meters!

After Grimsel, Furka and the Gotthardpass I had crossed the Alps for the first time. Over the following days I cycled along the beautiful lakes Lago Maggiore, Lago Lugano and Lago di Como, crossing the Swiss/Italian border surprisingly often. Weather conditions were perfect and it was fun to start adapting the route a bit when I had enough buffer time.

Second week - Berninapass, Stelvio, Timmelsjoch and up to Germany

After a week, the first rain hit, just in time for a rest day near St. Moritz in Engadin to relax and wait for my friend Riccardo Bessone. Riccardo had been traveling day and night by buses and trains to get there. After a relaxed breakfast, and with heavy rain outside, it was time for us to take on the challenge and go cycling. This quite epic ride took us from heavy rain to heavy snowfall up to the Berninapass, and we were happy to find shelter on our way down, where we could warm our frozen fingers at a fireplace and rest a bit before heading on via Tirano back up to Bormio.

On the second day of our shared weekend we cycled the Gavia Pass. Luckily the weather was sunny again, so we could enjoy scenery full of snow-covered mountains at an elevation of over 2600 meters.

Leaving that beautiful scenery behind us, we rode some kilometers further over Passo Tonale into the Trentino valley where I would continue my journey alone and say hi to lots of fresh apples that helped my daily need for calories.

Another highlight for sure was climbing Passo Stelvio - at 2757 meters the second-highest paved mountain pass in the Alps. After a long day of easy riding, I decided to start the climb around 4 pm and was really happy to have almost no traffic on the roads. Together with some cyclists from the UK, we arrived at the top before sunset, just as a bit of snowfall started, in time to find shelter at one of the mostly empty hotels.

While riding I was usually pretty much alone on the roads. There was plenty of time to reflect and do some thinking, but I also listened to a lot of podcasts to feed my brain. From time to time I would meet inspiring people doing similar long-distance rides, such as this guy from Thailand who was testing his bicycle after attending the Eurobike show.

Another magical moment was cycling over the Timmelsjoch from Südtirol to Austria. I was able to change my route so that I cycled the pass a week earlier, from south to north, instead of descending it a week later. At the top of the mountain, the weather changed from having been sunny all day to the summit being covered in cloud with quite some rainfall. But it wasn’t too bad, so I kept going slowly and soon could enjoy some spectacular scenery. See the “Himmelsloch am Timmelsloch” as identified by Greg and depicted above.

Approaching the end of week two, my road bike joined me for some off-road action. The Schrofen Pass is popular amongst mountain bikers, who carry their bikes over the hiking trail. Trying to avoid busy roads in that area, and equipped with a much lighter bike, I enjoyed carrying my road bike over the pass.

Third week - Dolomites and up to Austria

After another break at a friend’s wedding in Germany, it was time to cycle down to Italy again, where I enjoyed some of the most beautiful scenery of the tour. Especially when the clouds disappeared and the unique peaks of the Dolomites started shining through, I knew that every single investment in this entire tour was worth the effort.

After spending some days in the Dolomites, I approached the last week of my tour. My uncle Wolfgang “Radlwolf” Dabernig, together with his friend Kurti, met me in Italy, from where we cycled the Plöckenpass together. While I enjoyed cycling on my own, shared trips like this one with friends and family were great intermediate stops along the way, as we could share the excitement and spend valuable time together.

Fourth week - Rains, Kärnten, Slovenia up to Linz / Danube

Entering Kärnten also meant the beginning of an entire week of rain. Cycling in the rain was tough, but I was lucky to have all the equipment and mental resilience needed to survive even the hard days. As long as you keep moving you usually don’t get cold. A rain jacket, rain trousers and overshoes, plus a warm soup, saved my days.

Cycling through the Alps also meant visiting different regions and countries and getting in touch with many different cultures. I crossed borders 17 times and was happy that we no longer have border controls, thanks to the Schengen Area.

It’s hard to summarize the diversity of impressions I had during the tour in a single blog post with a few pictures. What I can say for sure is that late August to late September turned out to be a great season for cycling. While in the beginning it was really hot and I was glad to take a swim in one of the lakes, the later weeks of the tour turned out to be rather chilly. On the flip side, those early autumn weeks made for some beautiful visual impressions.

Final stretch - Linz to Vienna

From Kärnten I cycled up towards Linz, crossing the Alps for the last time. Because of a landslide caused by the heavy rains, the Sölkpass was closed and I had to take a detour via the Radstätter Tauernpass, which turned out just fine (and snow-covered) too. As I was getting closer to my final destination, DrupalCon Vienna, I had gotten into a routine of organizing my daily pack of clothes into the above-depicted bags.

On the last weekend, Riccardo joined me again to cycle along the Danube. After a long day on Saturday from Linz to Krems, the last day of cycling was planned to be a relaxed one. In Tulln, we stopped for lunch at the webshapers office, where we met more friends from the Drupal community. Together, we cycled the final leg to Vienna and even added in two small extra passes before arriving. At the Schweizerhaus, a group of Amazees, DrupalCon attendees, friends and family met us, and I was happy to finish the DrupAlps tour healthy and without any injuries.

Cycling the Alps for a month was an incredible experience that I can definitely recommend. I was happy not to have any major issues along the way, and I was really glad that the adaptive planning of the tour worked out even better than I had hoped. Thanks so much to everyone who helped me achieve this goal! Without all the great support I received, the DrupAlps tour wouldn’t have been the positive experience it was!

What’s next? I am also happy to be back at the office, starting my new role as Agile Consultant with the Amazee Labs Zürich team. In an upcoming post, I will certainly talk more about what’s going on in this area.

Thanks for reading and following my tour! For now, if you are interested, here are some more resources: