Community: Looking back at the first-ever DrupalCon Teamwork and Leadership Workshop

Planet Drupal - 6 June 2018 - 20:59

The Drupal Community Working Group (CWG), with support from the Drupal Association, organized and held the first-ever Teamwork and Leadership Workshop at DrupalCon Nashville on April 10, 2018. The goal of the three-hour workshop was to explore teamwork, leadership, and followership in the context of the Drupal community as well as to help provide support and resources for people in the Drupal community who work alongside others in teams and/or may find themselves in positions of responsibility or leadership. Additionally, we hoped to expand the base of people who can step into leadership positions in the Drupal community, and to help those who may already be in those positions be more effective.

The workshop was led by Drupal Association board chair Adam Goodman, who generously donated his time. Adam is the head of Northwestern University’s Center for Leadership, and he works as an executive coach and advisor to senior executives and boards of directors at dozens of companies and organizations around the world. 

As part of the planning for the workshop, Adam asked us to enlist a number of facilitators to help with the various workshop exercises. In addition to three CWG members (Jordana Fung, George Demet, and Mike Anello), the following community members also facilitated: Donna Benjamin, Shyamala Rajaram, Gábor Hojtsy, Angie Byron, and Tiffany Farriss. The facilitators met with Adam prior to the workshop to understand what would be expected of them. 

We wanted to make sure that we invited a diverse range of people to the workshop who are doing awesome work with Drupal around the world, including those whose efforts may not be as well-known or recognized (yet). We set an internal goal of having at least 50% of attendees come from populations historically underrepresented at DrupalCon, including those who self-identify as women, gender non-binary people, people of color, and/or people who are not from Europe, the United States, or Canada. To this end, prior to the public registration period, we sent out invitations to 64 community members, 75% of whom were from an under-represented cohort. We invited people who are involved in all aspects of the community, including (but not limited to) event organizers, sprint organizers, project maintainers, and past and current Aaron Winborn Award nominees. At the workshop, there were a total of 50 attendees (out of 60 available seats), with approximately 64% from underrepresented cohorts.

Attendees were seated at round tables of approximately 10 people each. The first half of the workshop consisted of large-group exercises that helped attendees think about what it means to be a leader and a team member. We talked about keeping perspective as team members and not jumping to conclusions about each other's behaviors based on an often (extremely) limited set of data. The second half of the workshop focused on smaller group exercises in which individuals responded to various prompts and then discussed them as a small (table-sized) group.

A few days after the workshop, we asked the attendees to complete an 11-question follow-up survey. Of the 50 attendees, we had 17 responses, a 34% response rate. We asked what their expectations were for the workshop; representative responses included:

I thought it would be a workshop on leadership, but I was surprised by the approach to the Drupal community.

Didn't know what to expect. So...none

The fact that we had multiple responses indicating that expectations were not clear tells us that we need to do a better job of communicating exactly what the goals and activities of the workshop will be in the future.

On a scale of 1-5, 73% of respondents indicated that the workshop met their expectations (via a rating of 4 or 5). 

We also asked respondents to share an insight from the workshop. Responses included:

Transition planning for responsibilities you take on and having a plan in place before even taking on the responsibility.

The need to know why each person on the team is present (their motivation) and the importance of unified movement toward a goal.

I hadn't written out what leadership looked like to me before, so I found that part of the exercise to be quite helpful.

The survey also showed that attendees found more value in the small-group exercises than in the large-group exercises (81.3% vs. 60%), and 81.3% indicated they would be interested in attending similar workshops in the future.

Many of the open-ended responses indicated that some attendees were hoping for more practical, hands-on advice for specific situations. In addition, several respondents felt that parts of the exercises were rushed and wished there had been more time. Finally, several attendees questioned the appropriateness of some of the imagery used in one of the workshop exercises, for which the CWG made a public apology following the event. We have reviewed all of the comments about aspects of the event that were considered negative or unhelpful and will take them into account as we consider how to improve the workshop in the future.

Overall, we feel the workshop was a success, and something that has been long overdue for the Drupal community. We've been discussing how we can make similar content available to everyone in the community, not just DrupalCon attendees. We're also open to suggestions for the topics and format of future workshops; let us know if you have any ideas.
 

"Wir machen das dicht": Apple will Tracking über Like-Buttons aushebeln

heise online Newsticker - 6. Juni 2018 - 19:30
Der Apple-Browser Safari weist künftig auf Tracking durch Share- und Like-Buttons hin und fragt um Erlaubnis. Dies richtet sich vor allem gegen Facebook.

Image recognition: Google Lens as a standalone Android app

heise online Newsticker - 6 June 2018 - 19:00
Google Lens is now available as a stand-alone app in the Google Play Store. However, the app does not work on all current Android smartphones.

"Signalpiraterie"-Vertrag: Extraschutz für Signale von Rundfunkanbietern

heise online Newsticker - 6. Juni 2018 - 18:30
Der Zombie unter den völkerrechtlichen Verträgen zum Schutz von Rechteinhabern ist zurück. Mitglieder der WIPO empfehlen eine diplomatische Konferenz.

Drupal blog: Virtual reality on campus with Drupal

Planet Drupal - 6 June 2018 - 18:16

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

One of the most stressful experiences for students is the process of choosing the right university. Researching various colleges and universities can be overwhelming, especially when students don't have the luxury of visiting different campuses in person.

At Acquia Labs, we wanted to remove some of the complexity and stress from this process by making campus tours more accessible through virtual reality. During my presentation at Acquia Engage Europe yesterday, I shared how organizations can use virtual reality to build cross-channel experiences. People who attended Acquia Engage Europe asked if they could have a copy of my video, so I decided to share it on my blog.

The demo video below features a high school student, Jordan, who is interested in learning more about Massachusetts State University (a fictional university). From the comfort of his couch, Jordan is able to take a virtual tour directly from the university's website. After placing his phone in a VR headset, Jordan can move around the university campus, explore buildings, and view program resources, videos, and pictures within the context of his tour.

All of the content and media featured in the VR tour is stored in the Massachusetts State University's Drupal site. Site administrators can upload media and position hotspots directly from within the Drupal backend. The React frontend pulls in information from Drupal using JSON API. In the video below, Chris Hamper (Acquia) further explains how the decoupled React VR application takes advantage of new functionality available in Drupal 8.
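
As a rough illustration of that data flow, here is a minimal sketch of how a decoupled frontend might pull tour content out of Drupal via JSON API. The site URL, the tour_location content type, and its field names are hypothetical; only the /jsonapi/node/{type} URL pattern and the resource-object response shape come from the JSON API specification used by Drupal.

```typescript
// Hypothetical sketch: fetch "tour location" nodes from a Drupal 8 JSON API endpoint.
interface TourLocation {
  id: string;
  title: string;
  description: string;
}

async function fetchTourLocations(baseUrl: string): Promise<TourLocation[]> {
  const response = await fetch(`${baseUrl}/jsonapi/node/tour_location`, {
    headers: { Accept: 'application/vnd.api+json' },
  });
  if (!response.ok) {
    throw new Error(`JSON API request failed: ${response.status}`);
  }
  const { data } = await response.json();
  // JSON API wraps each node in a resource object; flatten it for the frontend.
  return data.map((resource: any) => ({
    id: resource.id,
    title: resource.attributes.title,
    description: resource.attributes.field_description ?? '',
  }));
}

// Example usage (placeholder URL):
// fetchTourLocations('https://university.example').then(console.log);
```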

It's exciting to see how Drupal's power and flexibility can be used beyond traditional web pages. If you are interested in working with Acquia on virtual reality applications, don't hesitate to contact the Acquia Labs team.

Special thanks to Chris Hamper for building the virtual reality application, and thank you to Ash Heath, Preston So and Drew Robertson for producing the demo videos.

ZenBook Pro: Asus notebooks with a touchscreen as the touchpad

heise online Newsticker - 6 June 2018 - 18:00
In the new ZenBook Pro notebooks, a small touchscreen in the palm rest serves both as a touchpad and as a second display.

Apple's WWDC 2018 keynote: the highlights on video

heise online Newsticker - 6 June 2018 - 17:30
On Monday evening, Apple presented new versions of all four of its operating systems. Our video sums up iOS 12, macOS Mojave, and more.

ECJ: Operators of Facebook fan pages share responsibility for data protection

heise online Newsticker - 6 June 2018 - 17:30
Operators of fan pages on Facebook have to rethink. According to the European Court of Justice, they, just like Facebook itself, are responsible for data protection.

Drupalgeddon 2: Over 115,000 Drupal websites still vulnerable

heise online Newsticker - 6 June 2018 - 17:30
There are still thousands of Drupal websites that can be attacked through a critical vulnerability. Security updates have been available for more than two months.

Drupal Association blog: Email news after GDPR

Planet Drupal - 6 June 2018 - 17:09

GDPR took effect last month, and many organizations sent policy updates to your inbox. We took action on our email lists to acquire explicit consent from all subscribers. You can read about other actions we took to prepare for GDPR, but this post is all about what we communicate through the Drupal email list.

The Drupal email list had almost 64,000 subscribers receiving various newsletters from our programs, and we knew running a re-consent campaign would have an impact on the number of subscribers in each of our newsletter groups. It seemed worth the potential loss, because numbers don't always tell the full story of the impact communications can, and do, make for an organization.

There were two problems with our list that kept us from delivering tailored messages to the Drupal community: old or insufficient data on subscribers and limited newsletter options. To make the list more effective, we changed the structure of our subscriptions to focus on the type of message we're sending, rather than aligning them with Drupal Association programs.

Using Mailchimp's GDPR tools and suggestions to run a re-consent campaign, we asked for explicit consent to this new subscription structure. We enabled the Marketing Preferences section provided by Mailchimp, and now, in order to get email, you must select Email within that section of the form. This has been confusing for some people. Here is an easy way to look at it: the top section, where you choose your lists, simply selects the topics that interest you; the bottom section, where you check "Email" under Marketing Preferences, gives us explicit permission to email you about those topics.

The impact of running the re-consent campaign has been the loss of the vast majority of our list. Is this a problem? It depends. Ideally, only people who want to read our news are now subscribed, and we will see an increase in open rates. And hopefully community members won't be hearing about updates or announcements from other sources before they hear about them in our newsletters; if you do, please let us know.

Please take a moment to check that your subscription settings are how you want them. If you want to receive news, check the Email box under Marketing Preferences. (Enter your email address in this form and you'll get a message to update your subscription.)

If you are in any way interested in Drupal, you don't want to miss our messages. In particular, Bob Kepford (kepford) does a fantastic job of curating content for the Drupal Weekly Newsletter. There's something for everyone, every Thursday. Likewise, the special offers messages from our partners can help you learn about and save money on services. Thanks for keeping informed!

There are some communications which are not impacted by our subscription structure change. These include security notifications, blog post notifications, Drupal.org system messages, and any transactional messages - for instance, if you register to attend DrupalCon, we will still email you about DrupalCon.

macOS: The end of 32-bit apps, OpenGL and OpenCL

heise online Newsticker - 6 June 2018 - 17:00
macOS 10.14 will be the last Mac operating system that runs 32-bit software. OpenGL and OpenCL are also on their way out; Apple is going all in on Metal.

Mars: NASA rover Curiosity is collecting rock samples again

heise online Newsticker - 6 June 2018 - 17:00
For more than a year, the Mars rover Curiosity was unable to take rock samples. With some improvisation, it has now succeeded again.

Lullabot: Will JavaScript Eat the Monolithic CMS?

Planet Drupal - 6 June 2018 - 16:59

We’ve all been hearing a lot about JavaScript eating the web, but what does that mean for traditional content management systems like Drupal and WordPress? In many ways, it’s a fatuous claim to say that any particular language is “winning” or “eating” anything, but if a different approach to building websites becomes popular, it could affect the market share of traditional CMS platforms. This is an important topic for my company, Lullabot, as we do enterprise software projects using both Drupal, a popular CMS, and React, a popular JavaScript framework.

At our team retreat, one of Lullabot’s front-end devs, John Hannah, who publishes the JavaScript Report, referred to Drupal as a “legacy CMS.” I was struck by that. He said that, to many in the JavaScript community, PHP CMSes like Drupal, WordPress, and Joomla are seen this way. What does he mean? If these “monolithic” CMS platforms are legacy, what’s going to replace them? Furthermore, PHP, the language, powers 83% of total websites and plays a large part in platforms like Facebook. That’s not going to change quickly.

Still, JavaScript’s surging popularity is undeniable. In part, this is due to its ubiquity. It’s a law-of-increasing-returns, chicken-and-egg kind of thing, but it’s real. JavaScript is the only programming language that literally runs everywhere. Every web browser on every device has similar support for JavaScript. Most cloud providers support JavaScript’s server-side incarnation, Node.js, and its reach extends far beyond the browser into the internet of things, drones, and robots. Node.js means JavaScript can be used for jobs that used to be the sole province of server-side languages. And isomorphism means it can beat the server-side languages at their own game. Unlike server-side applications written in PHP (or Ruby or Java), an isomorphic application is one whose code (in this case, JavaScript) can run on both the server and the client. By taking advantage of the computing power available from the user’s browser, an isomorphic application might make an initial HTTP request to the Node.js server, but from there asynchronously load resources for the rest of the site. On any subsequent request, the browser can respond to the user without having the server render any HTML, which offers a more responsive user experience. JavaScript also evolved in tandem with the DOM, so while any scripting language can be used to access the nodes and objects that comprise the structure of a web page, JavaScript is the DOM’s native tongue.
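
To make the isomorphism idea slightly more concrete, here is a minimal sketch under assumed names (the function, page markup, and port are invented): the same render function produces HTML on the server for the first request, and the identical code, bundled for the browser (in a real app, via something like React and ReactDOMServer), handles later updates without another server-rendered page.

```typescript
// Minimal sketch of shared ("isomorphic") rendering.
import { createServer } from 'http';

// Shared code: runs unchanged in Node.js or in a browser bundle.
function renderGreeting(name: string): string {
  return `<h1>Hello, ${name}!</h1>`;
}

// Server side: render complete HTML so the first response is immediately usable.
createServer((_req, res) => {
  res.setHeader('Content-Type', 'text/html');
  res.end(`<body id="app">${renderGreeting('visitor')}</body>`);
}).listen(3000);

// Browser side (conceptually): the bundled copy of renderGreeting rewrites the
// page without a round trip, e.g.
//   document.getElementById('app')!.innerHTML = renderGreeting('Jordan');
```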

Furthermore, the allure of one-stack-to-rule-them-all may attract enterprises interested in consolidating their IT infrastructure. As Hannah told me, “An organization can concentrate on just supporting, training and hiring JavaScript developers. Then, that team can do a lot of different projects for you. The same people who build the web applications can turn around and build your mobile apps, using a tool like React Native. For a large organization, that’s a huge advantage.”

Node.js Takes a Bite Out of the Back-end

Tempted to consolidate to a single stack, one of Lullabot’s largest digital publishing clients, a TV network, has begun to phase out Drupal in favor of microservices written in Node.js. They still need Drupal to maintain metadata and collections, but they’re talking about moving all of their DRM (digital rights management) content to a new microservice. This also provoked my curiosity. Is the rapidly changing Node.js ecosystem ready for the enterprise? Who is guaranteeing your stack stays secure? The Node.js Foundation? Each maintainer? Drupal has a seasoned and dedicated security team that keeps core and contrib safe. Node.js is changing fast, and the small applications written in Node.js are changing faster than that. This is a rate of change that is typically anathema to enterprise software. And it’s a significant shift for the enterprise in other ways as well. While Drupal sites are built with many small modules that together create a unique application, they all live within one set of code. The Node.js community prefers to approach the same problem by building small applications that communicate with each other over a network, an architecture known as microservices. Does that work in the context of a major enterprise publisher? Is it maintainable? Airbnb, PayPal, and Netflix have proven that it can be, but for many of our clients in the digital publishing industry, I wonder how relevant the experience of those technology companies really is. Arguably, Amazon pioneered the modern decouple-all-the-things, service-oriented architecture, and, well, you, dear client, are not Amazon.

In this article, I’ll explore this question through examples and examine how JS technologies are changing the traditional CMS stack architecture within some of the client organizations we work with. I should disclose my own limitations as I tackle an ambitious topic. I’ve approached this exploration as a journalist and a business person, not as an engineer. I’ve tried to interview many sources and reflect the nuances of the technical distinctions they’ve made, but I do so as a technical layperson, so any inaccuracies are my own.

The Cathedral and the Bazaar

How is the JavaScript approach different from that of a CMS like Drupal? In the blogosphere, this emerging competition for the heart of the web is sometimes referred to as “monolith vs. microservice.” In many ways, it’s less a competition between languages, such as JavaScript and PHP (or Ruby or Python or Java), and more the latest chapter in the dialectic between small, encapsulated programs with abstracted APIs and comprehensive, monolithic systems that try to be all things to all people.

The JavaScript microservices approach of composing a project out of npm packages and the comparatively staid monolithic approach taken by Drupal are both descendants of open source collaboration, aka the “bazaar,” but the way things are being done in Node.js right now, as opposed to the more mature and orderly approach of the Drupal community, makes Node.js the apparent successor to the bazaar. Drupal, by contrast, with 411,473 lines of code in the current version of core, has cleaned up its tents and begun to look more like a cathedral, with core commits restricted to an elite priesthood.

Drupal still benefits from a thriving open-source community. According to Drupal.org, Drupal has more than 111,000 users actively contributing, meaning that, in perhaps the most essential way, it is still benefiting from Linux founder Linus Torvalds’s law: “Given enough eyeballs, all bugs are shallow.” Moreover, the Drupal community remains the envy of the free software movement, according to Google’s Steve Francia, who also guided the Docker and MongoDB communities following Drupal’s lead.

Nevertheless, JavaScript and its server-side incarnation are gaining market share for a reason. JavaScript is both accessible and approachable. Lullabot senior developer Mateu Aguiló Bosch, one of the authors of Contenta CMS, a decoupled Drupal distribution, describes the JavaScript ecosystem as follows: “There’s an open-script vibe in the community, with snippets of code available everywhere. Anyone, anywhere, can experiment with these snippets using the built-in console in their browser.” Bosch continues, “Node.js brings you closer to the HTTP layer. Many languages like PHP and Ruby abstract a lot of what’s going on at that layer, so it wasn’t until I worked with Node.js that I fully understood all of the intricacies of the HTTP protocol.”

Furthermore, thanks to transpilers such as Babel, JavaScript allows developers to program according to their style. For instance, a developer familiar with a more traditional nomenclature for object-oriented classes can use TypeScript and then transpile their way to working JavaScript that’s compatible with current browsers. “Proposals for PHP have to go through a rigorous process, then go to binary, then the server has to upgrade, and then apps for Drupal have to support this new version,” says Sally Young, a Lullabot who heads the Drupal JavaScript Modernization Initiative. “With transpiling in JS, we can try out experimental language features right away, making it more exciting to work in.” (There is precedent for this in the PHP community, but it’s never become a standard workflow.)
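
As a small illustration of that point (the class and its fields are invented for the example), a developer can write typed, class-based code like the following and let the TypeScript compiler or Babel emit plain JavaScript that today's browsers run:

```typescript
// Hypothetical example: familiar class-based, typed code that a transpiler
// turns into browser-compatible JavaScript.
class Article {
  constructor(
    private title: string,
    private published: Date,
  ) {}

  headline(): string {
    return `${this.title} (${this.published.getFullYear()})`;
  }
}

console.log(
  new Article('Will JavaScript Eat the Monolithic CMS?', new Date('2018-06-06')).headline(),
);
```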

However, JavaScript’s popularity is attributable to more than just the delight it inspires among developers.

Concurrency and non-blocking IO

I spoke with some of our clients who are gradually increasing the amount of JS in their stack and reducing the role of Drupal about why they’ve chosen to do so. (I only found one who is seeking to eliminate it altogether, and they haven’t been able to do so yet.) Surprisingly, it had nothing to do with Hannah’s observation that hiring developers for and maintaining a single stack would pay dividends in a large organization. This was a secondary benefit, not a primary motivation. The short answer in each case was speed: in one case, speed in the request-response sense, and in the other, speed in the go-to-market sense.

Let’s look at the first example. We work with a large entertainment media company that provides digital services for a major sports league. Their primary site and responsive mobile experience are driven by Drupal 8’s presentation layer (though there’s also the usual caching and CDN magic to make it fast). Beyond the website, this publisher needs to feed data to 17 different app experiences, including iOS, tvOS, various Android devices, Chromecast, Roku, Samsung TV, and so on. Using a homegrown system (the JSON API module wasn’t finished when this site was migrated to D8), this client pushes all of their content into an Elasticsearch datastore, where it is indexed and available for the downstream app consumers. They built a Node.js-based API to provide the middleware between these consumers and Elasticsearch. According to a stakeholder, the group achieved “single-digit millisecond responses to any API call, making it the easiest thing in the whole stack to scale.”
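
As a rough sketch of what such a middleware layer can look like (this is not the client's actual code; the index, field names, port, and route are invented, and it assumes Express plus the global fetch available in Node 18+), a thin API in front of Elasticsearch can be only a few lines:

```typescript
// Hypothetical Node.js middleware: downstream apps call this API, which proxies
// a search to Elasticsearch's REST _search endpoint and returns trimmed results.
import express from 'express';

const app = express();
const ES_URL = 'http://localhost:9200'; // placeholder Elasticsearch address

app.get('/api/articles', async (req, res) => {
  const term = String(req.query.q ?? '');
  const esResponse = await fetch(`${ES_URL}/articles/_search`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query: { match: { title: term } }, // simple full-text match on a title field
      size: 20,
    }),
  });
  const body = await esResponse.json();
  // Return only what the downstream app consumers need.
  res.json(body.hits.hits.map((hit: any) => hit._source));
});

app.listen(8080, () => console.log('API listening on :8080'));
```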

This is likely in part due to one of the chief virtues of Node.js. According to Node’s about page:

As an asynchronous event-driven JavaScript runtime, Node is designed to build scalable network applications…many connections can be handled concurrently. Upon each connection, a callback is fired, but if there is no work to be done, Node will sleep. This is in contrast to today's more common concurrency model where OS threads are employed. Thread-based networking is relatively inefficient and very difficult to use.

Multi-core CPUs can handle multiple threads in direct proportion to the number of CPU cores. The OS manages these threads and can switch between them as it sees fit. Whereas PHP moves through a set of instructions top to bottom, with a single pointer marking where it is in those instructions, Node.js keeps track of multiple pending operations at a time. To roughly characterize this difference between the Node.js asynchronous event loop and the approach taken by other languages, let me offer a metaphor. Pretend for a moment that threads are cooks in a kitchen awaiting instructions. Our PHP chef needs to proceed through the recipe one step at a time: chopping vegetables, then boiling water, then putting on a frying pan to sauté those veggies. While the water is boiling, our PHP chef waits, or the OS makes a decision and moves on to something else. Our Node.js chef, on the other hand, can multi-task to a degree: starting the water to boil, leaving a pointer there, and then moving on to the next thing.
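
Here is a tiny, self-contained sketch of that multi-tasking chef (the labels and delays are made up, with timers standing in for slow IO): two waits are started together, other work continues in the meantime, and the total time is roughly the longest wait rather than the sum of both.

```typescript
// Demonstrates the event loop overlapping two pending "IO" waits.
function slowTask(label: string, ms: number): Promise<string> {
  return new Promise((resolve) =>
    setTimeout(() => resolve(`${label} done after ${ms}ms`), ms),
  );
}

async function main(): Promise<void> {
  console.time('total');
  // Start "boiling water" and "heating the pan" at the same time...
  const boiling = slowTask('boil water', 300);
  const heating = slowTask('heat pan', 200);
  // ...and keep working while both are pending.
  console.log('chopping vegetables while waiting');
  const results = await Promise.all([boiling, heating]);
  console.log(results.join(', '));
  console.timeEnd('total'); // roughly 300ms, not 500ms, because the waits overlap
}

main();
```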

However, Node.js can only do this for input and output, like reading a database or fetching data over HTTP. This is what is referred to as “non-blocking IO.” And it’s why the Node.js community can say things like, “projects that need big concurrency will choose Node (and put up with its warts) because it’s the best way to get their project done.” Asynchronous, event-driven programs are tricky: parallelism is a hard problem in computer science, and it can have unexpected results. Imagine our cook accidentally putting the onion on the stove to fry before the pan is there or the burner is lit. This is akin to trying to read the results from a database before those results are available. Other languages can do this too, but Node’s real innovation is in taking one of the easier-to-solve problems of concurrent programming (IO), designing it directly into the system so it’s easier to use by default, and marketing those benefits to developers who may not have been familiar with similar solutions in more heavyweight languages like Java.

Even though Node.js does well as a listener for web requests, the non-blocking IO model bogs down if you’re performing CPU-intensive computation. You wouldn’t use it “to build a Fibonacci computation server in Node.js. In general, any CPU intensive operation annuls all the throughput benefits Node offers with its event-driven, non-blocking I/O model because any incoming requests will be blocked while the thread is occupied with your number-crunching,” writes Tomislav Capan in “Why the Hell Would You Use Node.js.” And Node.js is inherently single-threaded: if you run a Node.js application on a CPU with 8 cores, Node.js will use just one, whereas other server-side languages could make use of all of them. So Node.js is well suited to lots of little concurrent tasks, like real-time updates to requests or user interactions, but bad at computationally intensive ones. As one might expect given that, it’s not great for image processing, for instance. But it’s great for making seemingly real-time, responsive user interfaces.
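
To see why, here is a deliberately bad sketch along the lines of Capan's Fibonacci example (the port and route are arbitrary): while the synchronous recursion runs, Node's single event-loop thread cannot answer any other request, which is why real code would hand such work to a worker thread or a separate service.

```typescript
// Anti-pattern sketch: CPU-bound work on the event loop blocks every other request.
import { createServer } from 'http';

function fib(n: number): number {
  return n < 2 ? n : fib(n - 1) + fib(n - 2); // deliberately slow, blocking recursion
}

createServer((req, res) => {
  const n = Number(new URL(req.url ?? '/', 'http://localhost').searchParams.get('n') ?? 40);
  // Every other connection waits until this synchronous call finishes.
  res.end(String(fib(n)));
}).listen(3001);
```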

JavaScript Eats the Presentation Layer

There’s an allure to building a front-end with just HTML, CSS, and JavaScript since those three elements are required anyway. In fact, one of our largest clients started moving to a microservices architecture somewhat by accident. They had a very tight front-end deadline for a new experience for one of their key shows. Given the timeline, the team decided to build a front-end experience with HTML, CSS, and JavaScript and then feed the data in via API from Drupal. They saw advantages in breaking away from the tricky business of Drupal releases, which can include downtime as various update hooks run. Creating an early decoupled (sometimes referred to as “headless”) Drupal site led them to appreciate Node.js and the power of isomorphism. As the Node.js documentation explains, “after over 20 years of stateless-web based on the stateless request-response paradigm, we finally have web applications with real-time, two-way connections, where both the client and server can initiate communication, allowing them to exchange data freely. This is in stark contrast to the typical web response paradigm, where the client always initiates communication.”

After hacking together that first site, they found React and began to take full advantage of stateful components that provide real-time, interactive UX, like AJAX but better. Data refreshes instantly, and that refresh can be caused by a user’s actions on the front-end or initiated by the server. To hear this client tell it, discovering these technologies and the virtues of Node.js led them to change the role of Drupal. “With Drupal, we were fighting scale in terms of usage, and that led us to commit to a new, microservices-oriented stack with Drupal playing a more limited role in a much larger data pipeline that utilizes a number of smaller networked programs.” These included a NoSQL data-as-a-service provider called MarkLogic, and a search service called Algolia, among others.
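
For readers who haven't used React, a minimal sketch of a stateful component looks something like this (the component, endpoint, and polling interval are invented; a production app might receive server-initiated pushes over a WebSocket instead of polling): whenever the state changes, React re-renders the UI without a page reload.

```tsx
// Hypothetical stateful component: "score" refreshes from an API while the page stays put.
import React, { useEffect, useState } from 'react';

export function LiveScore({ gameId }: { gameId: string }): JSX.Element {
  const [score, setScore] = useState<string>('loading…');

  useEffect(() => {
    const load = async () => {
      const res = await fetch(`/api/games/${gameId}/score`); // placeholder endpoint
      setScore(await res.text());
    };
    load();
    const timer = setInterval(load, 5000); // poll every 5s; each state update re-renders
    return () => clearInterval(timer);
  }, [gameId]);

  return <p>Current score: {score}</p>;
}
```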

So what started as the need for speed to market led this particular client to discover the virtues of Node.js, and subsequently React. Not only can libraries like React provide more app-like experiences in a browser, but tools like React Native can be used to make native apps for iOS and Android. PWAs (progressive web apps) use JavaScript to make web applications behave in certain respects like native applications when they’re opened on a mobile device or in a standard web browser. If there was ever a battle to be won in making the web more app-like, JavaScript won that contest a long time ago in the days of jQuery and Ajax. Heck, it won that battle when we all needed to learn JavaScript in the late 90s to swap navigation images using onMouseOver.

Is JavaScript taking over the presentation layer? Only for some. If you’re a small to medium-sized business, you should probably use the presentation layer provided by your CMS. If you’re a large enterprise, it depends on your use case. A good reason to decouple might be to make your APIs a first-class citizen because you have so many downstream consumers that you need to force yourself to think about your CMS as part of a data pipeline and not as a website. Moreover, if shaving milliseconds off “time to first interactive” means earning millions of dollars in mobile conversions that might otherwise have been lost, you may want to consider a JS framework or something like AMP to maximize control of your markup. Google has even built a calculator to estimate the financial impact of better load times based on the value of a conversion.

That said, there are some real disadvantages in moving away from Drupal’s presentation layer. (Not to mention it’s possible to make extensive use of JavaScript-driven interactivity within a Drupal theme.) By decoupling Drupal, you lose many of Drupal’s out-of-the-box solutions to hard problems such as request-response handling (an essential component of performance), routing, administrative layout tools, authentication, image styles and a preview system. Furthermore, you now have to write schemas for all of your APIs to document them to consumers, and you need to grapple with how to capture presentational semantics like visual hierarchy in textual data.

As Acquia CTO Dries Buytaert wrote in The Future of Decoupled Drupal,

Before decoupling, you need to ask yourself if you're ready to do without functionality usually provided for free by the CMS, such as layout and display management, content previews, user interface (UI) localization, form display, accessibility, authentication, crucial security features such as XSS (cross-site scripting) and CSRF (cross-site request forgery) protection, and last but not least, performance. Many of these have to be rewritten from scratch, or can't be implemented at all, on the client-side. For many projects, building a decoupled application or site on top of a CMS will result in a crippling loss of critical functionality or skyrocketing costs to rebuild missing features.

These things all have to be reinvented in a decoupled site. This is prohibitively expensive for small to medium-sized businesses, but for large enterprises with the resources and a predilection for lean, specific architectures, it’s a reasonable trade-off to harness the power of something like the React library fully.

JavaScript frameworks will continue to grow as consumers demand more app-like experiences on the web and that probably means the percentage of websites directly using a CMS’s presentation layer will shrink over time, but this is going to be a long-tail journey.

JavaScript Eats the Admin Interface

PHP CMSes have a decade’s lead in producing robust editorial experiences for managing content with a refined GUI. Both WordPress and Drupal have invested thousands of hours in user testing and refinement of their respective user interfaces. Well, wait, you say, aren’t both Drupal and WordPress trying to replace their editorial interfaces with decoupled JavaScript applications to achieve a more app-like experience? Well, yes. Moreover, Gutenberg, the new editing interface in WordPress built on React, is an astonishing evolution of the content authorship experience, a consummation devoutly to be wished. Typically, editors generate content in a third-party application before moving it over to and managing it in a CMS. Gutenberg attempts to create an authorship experience to rival that of the desktop applications typically used for this purpose. At WordCamp 2015, Matt Mullenweg issued a koan-like edict to the WordPress community: “learn JavaScript, deeply.” He was preparing the way for Gutenberg.

Meanwhile, on Drupal island, the admin-ui-js team is at work building a decoupled admin interface for Drupal with React, under the banner of the JavaScript Modernization Initiative. In that sense, JavaScript is influencing the CMS world by beginning to eat the admin interface of two major PHP CMSes. As of this writing, neither interface is part of the current core release.

JavaScript Replaces the Monolithic CMS?

Okay, great: developers love JavaScript, JavaScript devs are easier to hire (perhaps), Node.js can handle concurrency in a more straightforward fashion than some other languages, isomorphic decoupled front-ends can be fast and provide interactivity, and the Drupal and WordPress admin UIs are being rewritten in React, a JavaScript library, in order to make them more app-like. But our original question was whether JavaScript might eventually eat the monolithic CMS. Looking at the evidence I’ve presented so far, I think you’d have to argue that this process has begun. Perhaps a better question is: what do we, the Drupal community, do about it?

A sophisticated front-end such as a single-page application, a PWA, or a React application, let’s say, still needs a data source to feed it content. And while it’s possible to make use of different services to furnish this data pipeline, editors still need a place to edit content, govern content, and manage meta-data and the relationship between different pieces of content; it’s a task to which the PHP monolithic CMS platforms are uniquely suited.

While some JavaScript CMSes have cropped up, such as Contentful (an API-first CMS-as-a-service platform), CosmicJS, Prismic, and ApostropheCMS, they don’t come near the feature set or flexibility of Drupal when it comes to managing content. As the head of Drupal’s JavaScript initiative, Sally Young, says, “the new JS CMSes do less than Drupal tries to do, which isn’t necessarily a bad thing, but I still think Drupal’s Field API is the best content modeling tool of any CMS by far.” And it’s more than the fact that these CMSes try to do less; it’s also an issue of maturity.

“I’m not convinced from my explorations of the JS ecosystem that NPM packages and Node.js are mature enough to build something to compete with Drupal,” says senior architect Andrew Berry. “Drupal is still relevant because it’s predicated on libraries that have 5-10 years of development, whereas in the Node world everything is thrown out every 6 months. In Drupal, we can’t always get clients to do major releases; can you imagine if we had to throw it out or change it every 6 months?” This was echoed by other experts I spoke with.

Conclusion

The monolithic web platforms can’t rest on their laurels and continue to try to be everything to everyone. To adapt, traditional PHP CMSes are going to require strong and sensible leadership that finds the best places for each of these tools to shine in conjunction with a stack that includes ever-increasing roles for JavaScript. Drupal in particular, given its enterprise bent, should embrace its strengths as a content modeling tool in an API-first world, a world where the presentation layer is a separate concern. As Bosch, Drupal’s API-First initiative lead, said at DrupalCon Nashville, “we must get into the mindset that we are truly API-first and not just API compatible.” Directing the formidable energy of the community toward this end will help us remain relevant as these changes transpire.

To get involved with the admin-ui-js initiative, start here.

To get involved with the API-first initiative, start here.

Special thanks to Lullabots Andrew Berry, Ben Chavet, John Hannah, Mateu Aguiló Bosch, Mike Herchel, and Sally Young for helping me take on a topic that was beyond my technical comfort zone.

Blair Wadman: How do you add a class to a Twig template in Drupal 8?

Planet Drupal - 6 June 2018 - 16:54

Adding CSS classes to templates allows you to target templates or parts of the template in your CSS files.

TEN7 Blog's Drupal Posts: Animal Humane Society Dog Habitat Kiosk

Planet Drupal - 6 June 2018 - 16:40

The Animal Humane Society (AHS) in Minneapolis, Minnesota, is the leading animal welfare organization in the Upper Midwest, helping 25,000 dogs, cats, and critters in need find loving homes each year while providing a vast array of services to the community, from low-cost spay and neuter services to dog training and rescuing animals from neglectful and abusive situations.

Maschinenbauer: "Automatisierung ersetzt nicht den Menschen"

heise online Newsticker - 6. Juni 2018 - 16:30
Der weltweite Trend zur Automatisierung der Fertigung sorgten für anhaltende Wachstumsimpulse. Mit dem befürchteten Jobabbau rechnet die Branche nicht.

MEPs fight "data protectionism" in the digital single market

heise online Newsticker - 6 June 2018 - 15:30
The Parliament backs a draft regulation under which there would no longer be requirements to store non-personal data nationally.

Numbers, please! 12,750,000 records, simply harvested

heise online Newsticker - 6 June 2018 - 15:30
The "automated information procedure" is used to identify individuals in telecommunications. In the process, all of a person's subscriber data is retrieved.

European Court of Auditors: Vectoring is slowing broadband rollout in Germany

heise online Newsticker - 6 June 2018 - 15:00
The European Court of Auditors believes Germany will not be able to provide nationwide speeds of up to one gigabit per second any time soon.

Red Hat Fuse 7: Integration platform for distributed environments

heise online Newsticker - 6 June 2018 - 15:00
With Fuse Online, container images for OpenShift, and deployment via Spring Boot, Red Hat's integration platform is breaking away from its traditional on-premise world.