Drupal.org - aggregated feeds in category Planet Drupal

AddWeb Solution: 11 Reasons Why Drupal Commerce Is the Best Choice for an Ecommerce Website

15 May 2018 - 8:24

The concept of a global village is becoming ever more real with the advancement of the online world, and online shops play a major part in that advancement. But as the need to build online stores has grown, so has the number of platforms on offer for building them.

Here’s where our experience and expertise come into the picture. After 500+ hours spent building more than 10 ecommerce websites, we’ve come to the conclusion that Drupal is indeed the best choice for building an ecommerce website. So here are 11 realistic reasons to guide you in choosing the best platform for your ecommerce website, which is undoubtedly Drupal Commerce.

 

1. An Array of Built-in Features
Drupal comes pre-loaded with the features required for building a commerce website, such as product management, payment methods, and cart management.

2. Time-Saving
Development time is reduced, because you no longer spend time developing and then custom-integrating two separate systems.

3. SEO Friendly
Drupal is SEO friendly and therefore helps your website rank higher in search engines.

4. Negligible Traffic Issues
Heavy traffic is never an issue with Drupal, since it is backed by a robust system built to handle high loads.

5. Social Media Integration
Social media platforms like Facebook, Instagram, Twitter, and LinkedIn integrate with Drupal out of the box.

6. High on Security
Drupal takes security seriously and ships with built-in mechanisms for protecting the data and information on your website.

7. Super Easy Data Management
Data management becomes easy with Drupal, since it is first and foremost a strong content management system.

8. Feasible for Ecommerce Websites
It is easy to build and run a Drupal-based ecommerce website, whether for a small enterprise or a large business.

9. Built-in Plugins for Visitor Analysis
Built-in modules for visitor reporting and analytics help you evaluate your website easily, without external support.

10. Customization
Drupal is flexible enough to let you customize your website to your needs.

11. Every Single Line of Code Is Free!
Drupal firmly upholds the core value of the open source community: nothing is charged for, and every line of code is free for everyone to use.


And you thought we were trying to sell it just because ‘We Drupal Everyday’? Well, it’s good that you’re now aware of the selfless efforts we make to resolve your tech-related confusion! We at AddWeb are Friends of Drupal Development.

Chapter Three: Introducing React Comments

14 May 2018 - 19:57

Commenting system giant Disqus powers reader conversations on millions of sites, including large publishers like Rolling Stone and the Atlantic. So when Disqus quietly introduced ads into their free plans last year, there was some understandable frustration.

Why did @disqus just add a bunch of ads to my site without my permission? https://t.co/CzXTTuGs67 pic.twitter.com/y2QbFFzM8U

— Harry Campbell (@TheRideshareGuy) February 1, 2017

 

CTI Digital: NWDUG Drupal Contribution Sprints

14 May 2018 - 18:48

Last weekend I attended my first ever Drupal Sprint organised by NWDUG.


Drupal Association blog: Progress and Next Steps for Governance of the Drupal Community

14 May 2018 - 16:39

One of the things I love most about my new role as Community Liaison at the Drupal Association is being able to facilitate discussion among all the different parts of our Drupal community. I have the extraordinary privilege of access to bring people together and help work through difficult problems.

The governance of the Drupal project has evolved along with the project itself for the last 17 years. I’m determined in 2018 to help facilitate the next steps in evolving the governance for our growing, active community.

2017 - A Year of Listening

Since DrupalCon Baltimore, the Drupal Community has:

  • Held a number of in-person consultations at DrupalCon Baltimore around the general subject of project governance

  • Ran a series of online video conversations, facilitated by the Drupal Association

  • Ran a series of text-based online conversations, facilitated by members of our community across a number of time zones

  • Gathered for a Governance Round Table at DrupalCon Nashville.

This has all led to a significant amount of feedback.

Whilst I highly recommend reading the original blog post about the online governance feedback sessions for a full analysis, there was clearly a need for better clarity, better communication, distributed leadership, and evolving governance.

2018 - A Year of Taking Action

There are many things happening in 2018, but I want to concentrate for now on two important activities: how we continue to develop our Values, and how we continue to develop the governance of our community.

So, why am I separating “Values” and “Governance” - surely they are connected? Well, they are connected, but they are also quite different, and it is clear we need to define the difference within our community.

In the context of the Drupal Community:

  • “Values” describe the culture and behaviors that members of the Drupal community are expected to uphold.

  • “Governance” describes the processes and structure of interaction and decision-making that help deliver the Project’s purpose whilst upholding the Values we agree to work by.

Values

What’s happened?

Quoting Dries:

Over the course of the last five months, I have tried to capture our fundamental Values & Principles. Based on more than seventeen years of leading and growing the Drupal project, I tried to articulate what I know are "fundamental truths": the culture and behaviors members of our community uphold, how we optimize technical and non-technical decision making, and the attributes shared by successful contributors and leaders in the Drupal project. 

Capturing our Values & Principles as accurately as I could was challenging work. I spent many hours writing, rewriting, and discarding them, and I consulted numerous people in the process. After a lot of consideration, I ended up with five value statements, supported by eleven detailed principles.

The first draft of the Values & Principles was announced to the community at DrupalCon Nashville.

What’s next?

Now that we have the first release of the Values & Principles, we need a process to assist and advise Dries as he updates the Values & Principles. After hearing community feedback, Dries will charter a committee to serve this role. A forthcoming blog post will describe the committee and its charter in more detail.

Community Governance

What’s happened?

At DrupalCon Nashville, many useful discussions happened on governance structure and processes.

  • A Drupal Association Board meeting, with invited community members, talked with existing governance groups to find out what is working and what is not. We realized that governance of the Drupal community is large, and it is difficult to understand all of its parts. We began to see a possibility for further action here.

  • The Community Conversation, “Governance Retrospective”, helped us to see that improving communications throughout the community is hugely important.

  • The Round Table Discussion on community governance brought together Dries, staff and Board of the Drupal Association, representatives of many of our current community working groups, representatives of other interested groups in the community, and other community members. This group looked at the Values & Principles, but also at how we are currently governed as a community and how we can improve that.

All these things fed into one of the very best parts of the DrupalCon experience: the “hallway track”. More and more throughout DrupalCon Nashville, ideas were formed and people stepped forward to talk with each other about how we can improve our governance. This happens all the time when we discuss the code of Drupal; I’m very excited to see it happening in other aspects of our project, too.

What’s next?

A structured approach is needed to ensure that everyone in our community understands how decisions are being made and can have input. Speaking with a number of those involved in the discussions above, a consensus developed that we can start putting something into action to address the issues raised. Dries, as Project Lead, has agreed that:

  • A small Governance Task Force would be created for a fixed period of time to work on and propose answers to the following:

    • What groups form the governance of the Drupal community right now?

    • What changes could be made to governance of the Drupal community?

    • How could we improve communication and issue escalation between groups in the community?

  • Task Force membership would be made up of a small group consisting of:

    • Adam Bergstein

    • David Hernandez

    • Megan Sanicki

    • Rachel Lawson

  • This Task Force would discuss whether or not it is beneficial to form a more permanent Governance Working Group, to handle escalated issues from other Working Groups that can be handled without escalation to the Project Lead.

  • This Task Force will propose a structure, the processes needed to run it, charters, etc. to the Project Lead for approval by the end of July 2018.

The Governance Task Force begins work immediately. The Charter under which we will work is attached.

I will help to facilitate reporting back regularly as we progress. I look forward to 2018 showing progress on both of these initiatives.

I am, as always, very happy to chat through things - please say hello!

File attachments:  Governance Task Force Charter.pdf

OpenSense Labs: Drupal and GDPR: Everything You Need to Know

14 May 2018 - 14:41
By Akshita

A lot has been written about the EU’s new data privacy regulation, the General Data Protection Regulation (GDPR). As we near 25 May, searches around GDPR compliance are breaking the internet.

ComputerMinds.co.uk: Got a config schema error on saving a view?

14 May 2018 - 13:15

We ran into an obscure error recently, when saving a view that used a custom views plugin. It was supposed to be a very simple extension to core's bundle (content type) filter:

InvalidArgumentException: The configuration property display.default.display_options.filters.bundle.value.article doesn't exist. in Drupal\Core\Config\Schema\ArrayElement->get() (line 76 of [...]/core/lib/Drupal/Core/Config/Schema/ArrayElement.php).

Several contrib projects have run into this issue too: Drupal Commerce, Search API, and Webform Views integration. There's even a core issue that looked relevant... but it turned out to be a simple, if perhaps surprising, fix. If you ever run into it, the error will name a different property, depending on which plugin and default value are used.

Our filter was little more than a simple subclass of \Drupal\views\Plugin\views\filter\Bundle, declared for a specific entity type in a very ordinary hook_views_data() (which had even been autogenerated by Drupal Console, so we were fairly confident the problem wasn't there). It just tailored the options form a little to work well for the entity type.

Views plugins all have their own configuration schema - for example, the bundle filter is declared in views.filter.schema.yml to use the 'in_operator', because multiple bundles can be selected for it. When we subclass such a plugin, we do not automatically get to inherit the configuration schema (as that is not part of the PHP class or even its annotation). (Perhaps core could be 'fixed' to recognise this situation ... but there are more important things to work on!)

The solution is to simply copy the schema from the plugin we've extended - in our case, that was 'views.filter.bundle', found in core's 'views.filter.schema.yml' file within views' config/schema sub-directory. Wherever it is, it's probably named 'views.PLUGIN.ID', where 'PLUGIN' is the type of your plugin (e.g. field, filter, area), and 'ID' is the ID in the class annotation of the class your plugin extends. We pasted the schema into our own schema file - which can be named something like /config/schema/mymodule.schema.yml, within our module's directory:

# Replace 'mymodule_special_type' with the ID in your plugin's annotation.
views.filter.mymodule_special_type:
  type: views.filter.in_operator
  label: 'Customised type selector'

Once that file is in place correctly, I expect you just need to rebuild caches and/or the container for Drupal to be happy again. Re-save the form, and the error is gone :-)

Configuration schemas should normally help development by catching errors, but as I've written before, an incorrect schema can make things surprisingly difficult. I hope someone else finds that this article solves their problem, so it doesn't take them as long to figure out! I haven't used it, but it's possible that the Configuration Inspector project could help you identify such issues.

Amazee Labs: Progressive Decoupling 2: A How To Guide

14 May 2018 - 12:52

In this series we take a closer look at progressive decoupling in Drupal 8. We go through the project setup, check the tools and libraries, and discuss potential applications. The first post of the series covered some new features that have made it into JavaScript over the last few years. This time, let’s see how to use them in a project.

By Blazej Owczarczyk

JavaScript transpilation was added to Drupal core in 8.4.0. The core JS contribution workflow is described in the change record “Drupal core now using ES6 for JavaScript development”. Unfortunately, the scripts mentioned there cannot yet be used to transpile contrib code; there’s an ongoing discussion about that in “Use ES6 for contrib and custom JavaScript development”. So we need to wait for that to be solved, right?

Not really. It turns out that it is enough to place the package.json file from core/ two levels up in the directory tree (in the case of a Composer project) and adjust the script paths to enjoy modern JS in contrib and custom code. With this file in the repository root we can run the install command (yarn install) to install dependencies, and we’re good to go with ES6.

The watch script (yarn watch:js) will start the watcher, which monitors all .es6.js files and transpiles them to their .js counterparts whenever a change is detected.

The scripts can be invoked in one of four ways.

To commit or not to commit?

Is it fine to commit the output (.js) files? That depends on the nature of the code. For a contrib module or theme, it’s best to do so: users shouldn’t be forced to transpile the code themselves, and the build process of Drupal modules is not adjustable at the time of writing this post.

Source maps

Contrib modules will most likely provide just the optimized code (without source maps). The committed .es6.js source files can be used to regenerate the output files with dev features enabled for individual environments if needed.

Custom code

The choice here depends on the hosting platform. If it supports adjusting the build process per environment, then the .js files don’t have to be committed at all: the source files are enough, and the compilation can be done before each deployment. Source maps can be used for dev, and prod should get the optimized build. This is how it looks in an .amazee.io.yml file, for instance:

As with every build artifact, excluding the compiled versions of JS files from the repository makes the development process smoother, mainly by reducing merge conflicts. On the other hand, committing them doesn’t have to be a big problem if the team is small enough.

Example

Here’s a recipe for adding an example ES6 library to a theme.

  1. Add this package.json file to the root of your project
  2. Install dependencies
  3. Start the file watcher
  4. Add a library definition to package_name.libraries.yml in your module or theme.
  5. Create the index file (js/mylib/index.es6.js)
  6. Save the file and check the terminal window with the file watcher: js/mylib/index.es6.js should be mentioned there, and the compiled version, index.js, should be created next to the source file. The library is now ready to be used in a module or theme.
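As a minimal sketch of steps 4-6, an entry file using ES6 features might look like the following (the library name, file path, and behavior name are illustrative assumptions, not from the original post):

```javascript
// js/mylib/index.es6.js -- illustrative entry file; all names are assumptions.
// ES6 features (classes, arrow functions, template literals) are fine here,
// because the watcher transpiles this file to js/mylib/index.js.
class Greeter {
  constructor(name) {
    this.name = name;
  }
  greet() {
    return `Hello, ${this.name}!`;
  }
}

// Attach as a Drupal behavior when running inside Drupal; fall back to a
// plain console message so the sketch also runs standalone under Node.
if (typeof Drupal !== 'undefined') {
  Drupal.behaviors.mylibGreeter = {
    attach() {
      console.log(new Greeter('Drupal').greet());
    },
  };
} else {
  console.log(new Greeter('Drupal').greet());
}
```

Saving a file like this should cause the watcher to emit the transpiled js/mylib/index.js next to it.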

    That’s it for setting up ES6 in a project. In the next post we’ll take a look at polyfills, talk about the Fetch API, and see how to use async functions - the undeniable game changer from ES8.

    If you want to learn more about Drupal libraries, check these out:

    Amazee Labs: Amazeenar #1 - GraphQL & Twig

    14 May 2018 - 8:57

    “Absolutely incredible!” - just one quote from our first Amazeenar, in which we explore the power of GraphQL Twig. Decoupling Drupal is the future; however, it may be a big leap to learn a whole new development stack. With GraphQL Twig, we can take baby steps with a soft-decoupled approach by writing GraphQL inside our Twig templates.

    By Daniel Lemon

    On Friday, 11 May, Amazee Labs hosted its first Amazeenar - a live video training session presented by Philipp Melab, who demonstrated some of the capabilities of GraphQL with the Drupal module GraphQL Twig.

    We started the webinar while a crowd joined live from over 13 countries around the world, including Belgium, Brazil, Canada, South Africa, and as far east as Thailand.

    It felt exciting to have a community of enthusiastic people connecting from so many different locations across the globe. This once again reinforced that Drupal is really about coming for the code and staying for the community.

    Philipp dove into the talk by giving us a quick introduction to GraphQL, with an example query for us to better understand the concept:

    query {
      node:nodeById(id: "1") {
        title:entityLabel
        related:relatedNodes {
          title:entityLabel
        }
      }
    }

    Running this example GraphQL query would give us the following JSON response:

    {
      "node": {
        "title": "Article A",
        "related": [
          { "title": "Article B" },
          { "title": "Article C" }
        ]
      }
    }
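To make that shape concrete, here is a standalone sketch (not from the webinar) of how client-side code might pick the aliased fields out of such a response:

```javascript
// The response shape from the query above, inlined for illustration.
const response = {
  node: {
    title: 'Article A',
    related: [{ title: 'Article B' }, { title: 'Article C' }],
  },
};

// ES6 destructuring makes pulling the aliased fields out concise.
const { title, related } = response.node;
const relatedTitles = related.map((node) => node.title);

console.log(title);                    // prints "Article A"
console.log(relatedTitles.join(', ')); // prints "Article B, Article C"
```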

    Inversion of control

    Philipp then explained the need for decoupling, providing us with a good overview of the fundamental differences between standard Drupal and Decoupled Drupal, in which the control moves from a push approach to a pull approach.

    React is great, but the inversion of control is crucial.

    Enabling the template to define its data requirements allows us to achieve a clear data flow with significantly increased readability and maintainability. The GraphQL Twig module allows us to add GraphQL queries to any Twig template; the query is processed during rendering and used to populate the template with data.

    Philipp entertained the audience with a live working demo in which, together, we learnt how to enhance the default “powered by Drupal” block to pull in the username of user 1. He then blew our minds with an additional surprise - pulling in the current number of open bug issues for Drupal Core via the GraphQL XML submodule.

    Catchup

    Did you miss the webinar? Don’t fret; we recorded everything!

    Amazee Labs would like to thank everyone who attended the live session, we enjoyed being able to share this with you, and we look forward to hosting another Amazeenar shortly.

    ThinkShout: Preparing for the GDPR

    13 May 2018 - 14:00
    “We’ve recently updated our privacy policy.”

    If you’ve ever given your email address to an online store, organization, or social media platform, or done just about anything online, then you’ve probably received the above notice in your inbox with increasing regularity over the last month or two.

    Most of these notices are related to the European Union’s General Data Protection Regulation (GDPR), which goes into effect later this month, on May 25, 2018.

    To be clear, we at ThinkShout are not lawyers and we strongly encourage our clients and anyone collecting user information in some way, shape, or form to seek legal counsel for your own specific obligations related to the GDPR. Here’s how we’re viewing the regulations and what actions we are taking at ThinkShout.

    The big picture

    The regulations apply specifically to organizations that collect or process data associated with EU citizens. The overall intent is to give EU citizens control over how their own data is collected and used.

    The stick being wielded to enforce the regulations is the possibility of fines of up to €20 million or 4% of an organization’s global annual revenue (whichever is greater). Charitable organizations are not exempt from these penalties; however, it’s likely that the steep fines will be reserved for recurring or significant privacy issues, and that the focus will be on fixing any issues that are discovered.

    There are questions about enforceability (particularly in the USA) that will likely need to be settled in court, but many of the regulations reflect smart privacy practices regardless of the penalties. All the chatter and hand-wringing about the GDPR has led to a fast-growing industry of businesses offering compliance audits, consulting, and technical solutions. Some of the vendors offering these services are legitimate, while many are simply designed to sell products or services based on embellished fears.

    The principles of the GDPR can be broadly summed up as protecting personal data by allowing individuals to choose what data they allow to be collected and how that data is used or processed, and by giving them control over that data even after it’s been collected. The UK’s Information Commissioner’s Office provides an easy-to-read guide to the GDPR that goes into detail on the various provisions, while the EU provides a more graphical explanation. That last link might be more palatable for the visual learners reading this.

    Portion of the EU’s graphical explanation of GDPR - full explanation can be found here.

    Does the GDPR apply to you and your users?

    In short: probably. While compliance is technically only required when handling data of EU citizens, discerning who is and isn’t an EU citizen can be difficult, and compliance in many cases isn’t all that cumbersome.

    Documentation and communication are two of the key areas of responsibility.

    Start with an audit of the data you collect from users, the data you collect from other sources and what is done with that data. Note that this isn’t just about new data but also any data already in your various systems (website, Salesforce, spreadsheets, etc.). Once you know what user information you have and why you have it, communicate that information to both your staff and your users by updating your privacy notices, and emailing constituents with that now famous subject line, “We’ve recently updated our privacy policy.”

    Document how your data handling processes are shared with new staff. It’s also a good idea to revise privacy policies written by lawyers to be “concise, transparent, intelligible and easily accessible” and “written in clear plain language.”

    Here’s an example of good privacy notices and requests for consent.

    Basically, ensure that the general population (who did not attend law school) can easily understand the language.

    Processing must be allowed under a lawful basis.

    Any processing of personal data must be supported by both the need to process that data as well as a lawful basis. Out of the eight lawful bases that the GDPR defines, consent, legal obligation and legitimate interest appear to be the most likely to be cited in the work of our clients. For consent to apply, it must be active (opt-in), current, specific and revocable.

    Legal obligation covers data needed for accounting or other legal audit trails. Legitimate interest is less defined, but addresses situations where the usage of the data can be reasonably expected, has minimal privacy impact and there is strong justification for the processing of the data. Using a user’s email address on an account they created to send them a link to reset their password might be an example of legitimate interest as a lawful basis.

    Individuals have defined rights to the protection and usage of their data.
    1. The right to be informed: privacy notices, accurate opt-in information, etc.
    2. The right of access: ability to see any data you have on an individual.
    3. The right to rectification: ability to correct any errors in the data you have - allowing users to update their own profiles covers much of this right.
    4. The right to erasure: ability to request data be removed. This is not all encompassing, nor does it need to be automated. Data needed for legal or other legitimate needs can be retained.
    5. The right to restrict processing: ability to request that their data not be processed but also not deleted.
    6. The right to data portability: ability to request a machine readable export of their data.
    7. The right to object: ability to opt out of marketing, processing based on legitimate interest or processing for research or statistical purposes. The process for opting out must be clearly explained in the privacy notice and at the point of first communication.
    8. Rights in relation to automated decision making and profiling: If you collect data to profile individuals for behavior tracking or marketing purposes then additional regulations apply.
    What about cookies?

    Cookies aren’t specifically called out in the GDPR; however, some of its provisions can apply to them. Some experts recommend altering site behavior so that cookies are not created until after the user has provided, and the site has recorded, consent. Several vendors offer paid services that support this approach, although altering the code on your site is generally necessary to use them correctly. A few Drupal modules and WordPress plugins also seek to provide this functionality. It is expected that in 2019 the revised e-Privacy Directive will shift some or all of the obligations for managing cookie consent to the browser.
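The consent-before-cookies pattern described above can be sketched as follows; this is a standalone illustration with assumed names, not code from any particular module or service:

```javascript
// Record which scripts were "loaded"; a real page would inject <script> tags.
const loaded = [];
const loadScript = (src) => loaded.push(src);

// Only load cookie-setting third-party scripts once consent is recorded.
function initTracking(consent) {
  if (!consent.analytics) {
    return false; // no opt-in recorded: load nothing, set no cookies
  }
  loadScript('https://www.google-analytics.com/analytics.js');
  return true;
}

console.log(initTracking({ analytics: false })); // prints false
console.log(initTracking({ analytics: true }));  // prints true
```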

    Recommendations

    We’re recommending that all our clients take the following steps to ensure compliance:

    • Evaluate your organization’s legal needs related to the GDPR. Consulting with your own counsel is recommended.
    • Appoint an internal person to take responsibility for data protection in your organization. While the GDPR includes provisions for appointing a Data Protection Officer (DPO), it’s specifically for public authorities and organizations whose core business is tracking behavior or processing criminal data. Appointing a staff person will help avoid a diffusion of responsibility regarding data security.
    • Audit your data collection and processing (here’s a sample template):
      • What is being held already and what is being collected?
      • Is there data being collected or stored that isn’t needed?
      • How is the collected data used within the organization?
      • Is there a legal basis for the different pieces of personal data being collected?
      • If consent is the legal basis, is the consent active (opt-in), granular and recent?
    • Review and revise privacy notices and cookie policies to be clearly written and comprehensive. Be sure to include information about third-party data collection (Google Analytics, AddThis, Facebook, etc). Here’s a privacy notice checklist to get you started.
    • Document processes for handling user requests as well as security breaches. Your organization has a month to respond to an individual’s request for export, access, or deletion of their data. In most cases this will currently be a manual process, although there is work happening in both the Drupal and WordPress communities to make these requests easier to accommodate. If there is a data breach, the GDPR states that the regulating agency must be notified within 72 hours. A good starting point is the Security Breach Response Plan Toolkit.
    • Evaluate if changes to your website (beyond the privacy/cookie notices) are necessary. Consider specifically:
      • Is Google Analytics configured properly? Ensure IP anonymization is enabled, data retention settings are correct and that no personal information is being tracked (check page urls, titles, etc.).
      • What third-party scripts or pixel trackers are included?
      • How is consent being collected in newsletter signup forms?
      • How is consent being collected in user registration forms?
      • Are there any other places where user data could be collected?
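As one concrete example from the checklist above: for sites using Google Analytics via analytics.js, IP anonymization is a single 'set' call. The snippet below stubs ga() so the sketch runs outside a browser (on a real page the standard analytics.js bootstrap defines it), and the tracking ID is a placeholder:

```javascript
// Minimal stub so this sketch runs outside a browser; on a real page the
// standard analytics.js snippet defines ga().
const calls = [];
const ga = (...args) => calls.push(args);

ga('create', 'UA-XXXXX-Y', 'auto'); // tracking ID is a placeholder
ga('set', 'anonymizeIp', true);     // truncate visitor IPs before they are stored
ga('send', 'pageview');

// The 'set' call must come before hits are sent for those hits to be anonymized.
console.log(calls.length); // prints 3
```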
    What’s next for us?

    Like most agencies, we’re continuing to learn more about the GDPR and the implications for our clients. We are working in partnership with them to help them understand and implement changes that might be needed on their sites or their internal processes. Internally we’re providing additional training on the principles of privacy by design to our team. In terms of our open source work we’ll be incorporating MailChimp’s GDPR consent forms into the Drupal MailChimp modules as soon as the functionality is available in their API. We see opportunities for including functionality related to subject access requests (export, deletion, etc) and consent tracking in our RedHen CRM suite of modules as well.

    The bottom line is: this is something we all need to be cognizant of; it’s not solely an EU issue. We’ll continue to keep a close eye on this as the GDPR gets rolled out, and there are many resources out there at your disposal (including within this blog post). You can get the latest from us on this and other digital trends by signing up for our newsletter and following us on Twitter. Good luck!

    Matt Glaman: Looking back at my first Drupal events and talks

    13 May 2018 - 8:00

    I have been spending the month of May cleaning out my basement office. I have found a treasure trove of old papers and random things that bring up memories and other thoughts, much like my last blog post on an organization's values and the culture it creates. This time I found two legal pads with my notes and planning for my first Drupal meetup talk and my first camp session.

    Jeff Beeman: Setting up a new project using BLT, Dev Desktop, and Lightning

    13 May 2018 - 2:43

    This is the first in a series of posts where I'll capture how I built a new Drupal 8 version of jeffbeeman.com using BLT, Dev Desktop, and Lightning. In later posts, I’ll talk about other local development solutions, dependency management, content migration, and how BLT helps me build and deploy artifacts to Acquia Cloud.


    Drupal Europe: Drupal Europe Launches its Industry Verticals — First Up: Media and Publishing

    12 May 2018 - 9:42
    Photo by janeb13 on pixabay

    As you’ve probably read in one of our previous blog posts, industry verticals are a new concept at Drupal Europe. Verticals replace the summits, which typically took place on the Monday before the conference. Sometimes these were in a separate location from the main conference venue, and often the conference ticket did not cover access to them. At Drupal Europe, summits have become industry verticals and are integrated with the rest of the conference: same location, same ticket. What follows is a summary of what to expect from the new verticals at Drupal Europe.

    Verticals bring together people from an industry to:

• learn about outstanding projects being developed in their field
    • share their interests and ideas
    • listen and learn together in sessions
    • discuss challenges and develop solutions for their market

With all the topics under one roof, every session you are interested in is within easy reach. There will be a guide to the sessions and people involved in each industry; first up: Media and Publishing.

However, nothing will stop you from getting off the beaten track to mix & match your interests! One Drupal Europe ticket gives you access to everything we have on offer: sessions, workshops, panels, sprints, and BoFs. For example, you might start with a great session about a new top-notch publishing site, get dragged into the details of the site’s complex workflow rules, and later jump into a technical session to understand the specific decoupled approach the site was built with.

    Photo by Rachael Crowe on Unsplash

All attendees will enjoy an amazing program providing insights into industry-specific challenges and solutions. The Media and Publishing industry is a great example of an industry that has faced huge challenges: it has had to almost reinvent itself to adapt to the digital age and succeed in the 21st century, with our society’s media consumption shifting from paper to digital.

We want to provide the best possible lineup of speakers, panels and sessions for the publishing/media vertical. Session submissions for Drupal Europe will open very soon. A choice of different verticals will be available on the website, so please tag your session as publishing/media to indicate that it is related to this industry. We welcome proposals for all topics related to publishing/media!

If you have something interesting to share (questions, thoughts, advice) that might help us before we officially open our call for papers, please reach out to hello@drupaleurope.org.

    Photo by rawpixel on Unsplash

We can’t wait to see you at Drupal Europe, 10–14 September in Darmstadt, Germany!

    Xeno Media: Communication is the Key to Great Client Relationships

    11. Mai 2018 - 18:52

    We are proud to be recognized for our commitment to our clients!

    We are now featured on Clutch.co with an overall 5-star rating from our clients and on Clutch’s sister website, The Manifest, as one of the top web developers in Chicago. We work to build strategic relationships, not just accounts. We listen to what our clients want to achieve and produce websites to meet those goals. A running theme to the client reviews has been our commitment and open communication.

     

    Clutch is a B2B ratings and reviews platform that connects small and medium-sized businesses with the developer or software solution that best fits their needs. The Manifest is a research platform meant to aid buyers in the awareness, discovery, and decision-making process of selecting an agency to hire for a project. We are thrilled to be a part of both websites, demonstrating our expertise in web development and design services.

     

    Clutch reviews are unique in that they are verified and independent. Their analysts personally interview our clients to understand what it is truly like to partner with Xeno Media. See what some of them had to say:

    “I’ve worked with a variety of outside vendors, and Xeno Media stood out in every aspect of the engagement. I would recommend them to anyone without hesitation.”

    “Xeno Media doesn’t just service us—they take care of us. They actually care about and understand us as a client. The level of trust between us is something that is very rare in a client relationship, but they have ours and we have theirs.”

    “They're always on top of all aspects of the project. If I ask them to do something, it's done immediately. They're really quick.”

    “This was one of the best experiences I’ve ever had working with an outside vendor.”

    Special thanks to our clients for taking the time to share their candid comments with the Clutch analysts. It has been our pleasure working with you! We would also like to thank Clutch for including us in their research. We are looking forward to maintaining our rank as a leading web developer and reading more great reviews soon.

    See full reviews on our Clutch profile!

    Virtuoso Performance: Back in the saddle again - at DrupalCon Nashville

    11. Mai 2018 - 18:06
Back in the saddle again - at DrupalCon Nashville mikeryan Friday, May 11, 2018 - 11:06am

Return to action

Those of you with an interest in Drupal migration may have noticed my absence in the last several months. A confluence of things led me to take a break from the Drupal community: a bout of physical exhaustion (initially diagnosed as Lyme disease, and then ¯\_(ツ)_/¯); professional exhaustion managing the D8 migration core initiative and maintaining a few contrib modules on top of paid contracts; and emotional exhaustion from the community drama of early last year. I was, frankly, depressed - whether it was more a product of the exhaustion or an underlying contributor to it (actually, probably both, in a feedback loop), I needed to deal with it. Fortunately, my contract work had been going well enough that I could afford to take a sabbatical for a while (although perhaps not for quite as long as I did!).

    Pro tip: engaging with the real world helps a lot. In my case, my wife and I had moved to Southern Illinois, where we didn’t know anyone, four years ago. While I haven’t exactly been housebound - we attend many local events, eat out at the local restaurants, hike in the Shawnee National Forest, etc. - I wasn’t really engaged with the local community. In the last several months, however, I’ve volunteered (and now become a board member) at the Liberty Theater - making popcorn and selling concessions at many events, as well as building them a new website (to be launched soon - on WordPress, sorry Drupal folks!) and proposing and running their first GoFundMe campaign. I’ve also been playing guitar and bass in weekly jam sessions at our favorite brewpub, and I took the lead in rebooting the series when the original organizer took a sabbatical himself. I’m feeling much better now - it’s good to be back.

    What next?

    Number one - I’m open for contracts again, please let me know if you have some migration work that needs to be done! Drupal shops, I’m happy to subcontract and let you focus on design and site-building while I get your clients’ data moved over into their new site.

    Contribution-wise, I’m keeping my eyes on the migration-related issue queues and helping out a bit where I can, but I’m not going to take on nearly as much as I had before. I’ll probably budget, say, one half-day a week for this for the time being.

    I had a couple of overly-ambitious, everything-but-the-kitchen-sink blog posts covering various migration techniques in the works when I stepped aside - I’ll try to extract some of the best bits into smaller and more targeted posts.

    Finally, I wasn’t completely idle technically during my sabbatical - I took the time to play around with a few different ideas (some migration-related, some not). I’m pursuing one avenue in particular - more on that soon!

    DrupalCon Nashville

I procrastinated on making a final decision about attending DrupalCon this year. Just as I finally committed to going, I got a lucky break: I saw QED42 tweet that they had a spare ticket, and I snatched it up (thanks again for the ticket and for the beautiful shirt!).

    As always, it was good to catch up face-to-face with so many Drupalists - especially this year, when I haven’t been active online. Among many others, I had some quality time with Dries, which I haven’t had since I was at Acquia, talking about some of my tech efforts above. And borrowing his hat:

    Dries let me wear his hat (and took the picture) pic.twitter.com/VqbrN0bdi2

    — Mike Ryan (@mikeryan776) April 10, 2018

At camps and cons in recent years, I've found that while the tech sessions are the most enticing at first glance, they generally cover things I can learn best by digging in and doing back in my home office. So, I tend to attend more “soft” sessions - those focusing on community and self-care. This time, that included (among others) You Matter More Than The Cause (Jeff Eaton) and Growing Our Tribes: Creating Sustainable Micro-Communities in Drupal and how you can help (Shannon Vettes). Eaton’s talk in particular spoke to me (I could have used it a year earlier!): the expectations that volunteers in a valued community have of each other, and of themselves, can chew them up. I felt bad about “abandoning” the migration initiative as we approached core stability, but I had to step away. Eaton says:

    If a cause can’t live without you suffering for it, then that cause may just deserve to die.

    Fortunately, the migration initiative did not deserve to die - heddn and quietone and many other contributors got the migrate module to stability, and are well on the way to adding migrate_drupal to the stable modules list. Yay!

    Main takeaway: “You are not consumable fuel.”

I did make it to a couple of tech sessions, though, notably Entities 301 - Entities Elsewhere (Ron Northcutt), which was about leveraging swappable storage backends to represent external data as native Drupal entities. This was of particular interest to me: while we normally think of the Drupal migration system as a means of importing external data into the Drupal database, the architecture also permits us to design destination plugins that write to an external source, whether another database or PUT operations on a RESTful API (no, I haven't yet had occasion to write such a destination plugin). But if you were to define an entity type using external storage, you wouldn't have to do anything special to populate those entities with a migration; just use the general-purpose entity plugin:

```yaml
destination:
  plugin: entity:my_remote_entity_type
```

One more thing…

    Thinking about online engagement, and also about security (like the occasional emergence of -geddons that take advantage of potentially any open form submissions), I had an idea - why not use Twitter as my commenting system? I.e., embed the tweet promoting a post at the bottom of that post, and people can comment by replying to the tweet.

    Advantages:

    1. I can now disable account creation and comment creation on my site (I’m leaving the comment module in place here for the sake of past comments), improving my site security.
    2. Commenters don’t have to worry about my site’s security if they’re only engaging with Twitter.
    3. Vetting user accounts and reviewing comments for spam are now Twitter’s problems, not mine.
    4. More visibility for commenters - their responses to my blog post will be seen by a lot more people on their Twitter feed (more of whom are predisposed to hear what they want to say) than they would on my blog site.
    5. More visibility for me - other people’s responses on Twitter will help bring attention to my blog posts.
    6. In practice, these days when you tweet a post you’ve made, you’re likely to get more discussion about it on Twitter than you do on your site anyway. Why split the threads?

    Disadvantages:

    1. This would exclude comments by people without a Twitter account. I don’t think this will have a big impact on my lightly-visited blog, but this would be more of an issue with higher-traffic sites trying to be all-inclusive.
    2. Replies would not be visible on the blog page. I don’t think this is a big deal, at least for my site. And, there could be a module for that!

For now, I’m doing this manually with a text field below the body using the ckeditor_media_embed module: publish the blog post, publish the tweet, add the tweet to the blog post. If this approach to comments works well in practice, it would be nice to have a module that adds a field to the node form for the tweet text and, at publish time, creates the tweet and links to it from the post automatically, virtually eliminating the window where the two aren't in sync.

    Thoughts? Reply to the tweet below!

Tags: Planet Drupal, DrupalCon, Drupal

Use the Twitter thread below to comment on this post:

    Back in the saddle again - at DrupalCon Nashville https://t.co/ZIAVaOfeXL

    — Virtuoso Performance (@VirtPerformance) May 11, 2018

     

    OpenConcept: Let's Rethink Procurement and Get Accessible ICT

    11. Mai 2018 - 16:13

    Everybody wants their website to be accessible. Information and Communication Technologies (ICT) is key for any modern organization. In the Government of Canada, it is a clear part of the Liberal Party's mandate. Setting a strong policy direction is critical, but then what?

    Most departments still see accessibility as a one-liner that can be added to an ICT contract. Then the responsibility for any shortfalls lies on the vendor.

    Sadly, this doesn't work. Accessibility is a journey, not a destination.

Web accessibility is complicated; the ecosystem and use cases change over time. So what can procurement do to fix this? I've collected a series of examples of good accessibility procurement practices, and I am hoping to add some good Canadian examples to the list.

    Start Off Right

    The first thing that government RFPs & contracts can do is set a good example.

Too often these documents are overly complicated and very difficult to understand. Make sure that they are written in plain English. In writing this blog post I've used HemingwayApp.com to check that my ideas are easy to understand.

PDFs have well-known accessibility problems. You can make them quite accessible with considerable effort, but it isn't a best practice. Use PDFs only where a signature is required; otherwise, it is more accessible to send EPUB or OpenDocument files. Vendors should not be asked to produce more PDFs as part of the contract.

    Focus on the Process

Don't obsess over the end result. As with security, there are going to be things that are missed. Of course, everyone wants a site that is as accessible as it can be, but ultimately there are other goals as well.

The earlier accessibility efforts are brought into project planning, the better the results will be. So, vendors should be asked about the process that they use to build in accessibility. You need to know that they have a team on board that has experience addressing this complex issue. It is also important that there is a clear and open feedback channel for people who face barriers. People with disabilities need to be part of the process.

It would be good to ask for a description of how the vendor overcame a difficult accessibility challenge. Sometimes you need to make compromises to balance design or usability requirements. Often there are multiple ways of doing things and no clear best practice.

    Open

    I'm big into open source software (OSS). As Drupal 8 Core Accessibility Maintainer, I'm also not sure how we can make progress without OSS.

Starting with an existing OSS application means you begin a project with one that has already had some testing. It will have a list of known bugs, and hopefully a longer list of bugs that have been fixed.

Many people choose Drupal because it is one of the most accessible platforms in the world. Unfortunately, most do not contribute to improving its accessibility. If a vendor proposes an open source solution, ask how they are working to improve its accessibility. Contributing back is key to the success of any community project. Furthermore, clarify the process for when the vendor finds accessibility errors: vendors should be submitting bug reports and patching the code when errors are found.

    Testing

The vendor and the client both have a responsibility to do testing as the product develops. Both should be doing regular spot checks with tools like WebAIM's WAVE toolbar. It also takes very little time to learn how to do keyboard-only testing. There is no reason why even non-technical people can't use these approaches to catch basic errors.

    Screen reader testing requires considerable experience to be effective. Having a developer run a site through VoiceOver or NVDA just isn't good enough. Contracts should have a component in them to ensure that a 3rd party is engaged to provide an evaluation. It should be considered no different than hiring an editor to review the work of an author. If possible, this evaluation should engage people with disabilities.

    Sales vs Delivery

    The focus needs to be on the process of the vendor and how that will ensure a more accessible delivery. Things like the Voluntary Product Accessibility Template (VPAT) are still essentially sales documents. For some more detailed approaches, see:

     

    OECD Playbook

    Canada is a member of the Organisation for Economic Co-operation and Development. I was happy to be part of a meeting called by Treasury Board to discuss reforming the OECD Playbook for ICT Procurement. It is clear that there is a deep interest in government in doing procurement differently to get more accessible results. A big part of this goes back to learning to embrace open source and to collaborate outside of government silos. It is great to see many instances of people working together to learn how to implement technology better. We need much more of this! 


    Lullabot: Nightwatch in Drupal Core

    11. Mai 2018 - 13:38

    Drupal 8.6 sees the addition of Node.js based functional browser testing with Nightwatch.js. Nightwatch uses the W3C WebDriver API to perform commands and assertions on browser DOM elements in real-time.

    Up until now, the core method for testing JavaScript and browser interactions in Drupal has been to use either Simpletest or PHPUnit. For tests that require a simulated browser, these have been running on PhantomJS, which is now minimally maintained and has been crashing frequently on Drupal CI (as of 8.5, Drupal can now use Chromedriver for these tests, however the majority of tests are still running on Phantom). This situation requires Drupal themers and front-end developers, working primarily with HTML / CSS / JavaScript, to learn the PHPUnit testing framework and specifics of its implementation within Drupal, instead of being able to write tests for page interactions and JavaScript in JavaScript itself.


Nightwatch is a very popular functional testing framework that has now been integrated into Drupal, allowing you to test your JavaScript with JavaScript! Drupal's implementation uses Chromedriver out of the box, but it can also be used with many other browsers via Selenium. Chromedriver was chosen because it is available as a standalone package, so it doesn't require you to install Java and Selenium, and it runs the tests slightly faster.

```javascript
module.exports = {
  before: function (browser) {
    browser.installDrupal();
  },
  after: function (browser) {
    browser.uninstallDrupal();
  },
  'Visit a test page and create some test page': (browser) => {
    browser
      .relativeURL('/test-page')
      .waitForElementVisible('body', 1000)
      .assert.containsText('body', 'Test page text')
      .relativeURL('/node/add/page')
      .setValue('input[name=title]', 'A new node')
      .setValue('input[name="body[0][value]"]', 'The main body')
      .click('#edit-submit')
      .end();
  },
};
```

    A very powerful feature of Nightwatch is that you can optionally watch the tests run in real-time in your browser, as well as use the .pause() command to suspend the test and debug directly in the browser.


    Right now it comes with one example test, which will install Drupal and check for a specific string. The provided .installDrupal command will allow you to pass in different scripts to set Drupal up in various scenarios (e.g., different install profiles). Another available command will record the browser's console log if requested so that you can catch any fun JavaScript errors. The next steps in development for core are to write and provide more helpful commands, such as logging a user in, as well as porting over the 8.x manual testing plan to Nightwatch.


    You can try Nightwatch out now by grabbing a pre-release copy of Drupal 8.6.x and following the install instructions, which will show you how to test core functionality, as well as how to use it to test your existing sites, modules, and themes by providing your own custom commands, assertions, and tests. For further information on available features, see the Nightwatch API documentation and guide to creating custom commands and assertions. For core developers and module authors, your Nightwatch tests will now be run by Drupal CI and can be viewed in the test log. For your own projects, you can easily run the tests in something such as CircleCI, which will give you access to artifacts such as screenshots and console logs.
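To make the CI suggestion concrete, here is a rough sketch of what such a setup could look like. This is not from the post: the Docker image name, the artifact path, and the assumption that you run Drupal core's own test:nightwatch script from the core/ directory are all mine, and will vary by project.

```yaml
# .circleci/config.yml - hypothetical minimal setup
version: 2
jobs:
  build:
    docker:
      # An image with PHP, Node, and Chrome preinstalled (assumed name).
      - image: circleci/php:7.2-node-browsers
    steps:
      - checkout
      # Drupal core's package.json lives in core/.
      - run: yarn --cwd core install
      # Runs the Nightwatch suite; expects test settings in core/.env.
      - run: yarn --cwd core test:nightwatch
      - store_artifacts:
          # Assumed report location; screenshots and console logs land here.
          path: core/reports/nightwatch
```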


    Amazee Labs: DrupalCamp Transylvania 2018

    11. Mai 2018 - 10:46
    DrupalCamp Transylvania 2018

If you were in the city of Cluj-Napoca between 4 and 6 May 2018 and walked around The Office, you probably saw over 100 people from all over the world, wearing the same t-shirts and talking about Drupal. That's because DrupalCamp Transylvania was in town.

    Vasi Chindris Fri, 05/11/2018 - 10:46

    If you know a bit of Romanian history and have heard about Transylvania, you probably know about Vlad the Impaler. If not, then you've probably heard about Dracula. Either way, they're the same person. You may be asking yourself, "What has Dracula got to do with Drupal?". Well, the answer is in the picture below:

We all want Drupal to be immortal, because we love developing awesome websites with it. That said, we must remember one thing: it's not all about work and making money, it's also about having fun using Drupal. That was one of the key points of Robert Douglass' keynote, "My Drupal Mid-Life Crisis".

One of the most interesting sessions was Larry Garfield's "The container is a lie!". On reading the title, you'll probably want to check it out, since you most probably use containers (not necessarily Docker containers, although Docker is probably the most used these days) in your everyday work. He spoke about how software runs on modern Linux systems, why we should not think of boats, whales, or shipping (or even Docker) when we hear the word container, and why it is actually useful that modern software is built and run on these "lies". These "lies" are part of our everyday work and, more importantly, make deployment to different environments so much easier.

Another very important topic, not only in the Drupal community but in technology in general, is the GDPR (General Data Protection Regulation). Balu Ertl had a great session entitled Overview of GDPR modules for Drupal, in which he provided an overview of all the Drupal modules that can help your site achieve GDPR compliance.

    The conclusion was that we have quite a few modules (9) in this category, some of them available on both Drupal 7 and 8. Some of them implement a small part of the regulations (like the consent for using personal data, the possibility to delete or download all the personal data of a user, the possibility to anonymize user information when dumping a production database, etc.) and many of them implement overlapping features.

    But there seems to be one module, General Data Protection Regulation, which tries to bring all these modules together under one umbrella so that we can have a unified and clear solution for making a site GDPR compliant.

Another thing that came up during the discussions was that this is a really complicated subject for both technical and legal minds, and as such, you'll most probably not be fined immediately if you're not 100% GDPR compliant on 25 May 2018. The most likely scenario is that the authorities will be there to help at first, and only fine you as a last resort. That said, this cannot be confirmed, and everything should still be done to be GDPR compliant by the deadline.

    Wait, there's more! While attending Lenard Palko's presentation, we saw this:

No, we did not watch an episode of Doc McStuffins. This was Auditing PHP Applications, a session in which Lenard Palko showed us how his team deals with auditing PHP applications and what to look for when doing such an audit. He also shared some helpful tools and advice on how to structure the report.

As you can see, it was a great DrupalCamp: a nice location, great presenters, lots of people, and a dedicated sprint room. So, did we have any time for anything other than coding and talking about Drupal? Yes, we did! We had some great parties each evening, and a brave few of us even went for a morning run on Saturday.

    I'm already looking forward to the next DrupalCamp Transylvania in 2019. See you there!

    Agiledrop.com Blog: AGILEDROP: Drupal SEO Tips

    11. Mai 2018 - 2:40
Drupal is a CMS that’s well known for its security and flexibility. But do you know of another aspect that Drupal excels at? Its excellent built-in SEO functionality. While Drupal itself plays pretty well with search engines, there are a host of measures you can take to stay on top of the SEO game. After all, the internet is a competitive market, making a site’s SEO all the more important. In this post, I’ll explore some of the most common measures you can take to bolster your Drupal site’s SEO efforts. The Basics: All the basic SEO measures that are…

    Acro Media: Drupal Commerce 2: How to Add and Edit Product Content

    10. Mai 2018 - 16:45

    The hardest part of Drupal Commerce 2 is the configuration of it all. Luckily, most store managers and administrators don't need to worry about that part. What they DO need to worry about is how to actually add new products to their stores and manage existing ones. For some ecommerce stores, keeping your product offerings fresh and up-to-date can mean the difference between success and failure. If you're using Drupal Commerce 2, managing your store content is easy!

In this Acro Media Tech Talk video, we use our Urban Hipster Commerce 2 demo site to show you the Drupal Commerce 2 product interface and how to add and edit products. The products shown will be configured differently than your own, but the same principles apply no matter what type of product you sell.

It's important to note that this video was recorded before the official 2.0 release of Drupal Commerce, using a beta release of the Commerce Shipping module. You may see some differences between this video and the current releases. The documentation is also evolving over time.

    Urban Hipster Commerce 2 Demo site

    This video was created using the Urban Hipster Commerce 2 demo site. We've built this site to show the adaptability of the Drupal 8, Commerce 2 platform. Most of what you see is out-of-the-box functionality combined with expert configuration and theming.

More from Acro Media

Drupal modules in this demo