In-memory databases: Software AG draws level with HANA

heise online Newsticker - 9 November 2017 - 8:00
Darmstadt-based Software AG wants to attack SAP's flagship product HANA with a new version of its Terracotta DB. In-memory databases may be developing into the last remaining alternative to the cloud.

myDropWizard.com: Using lots of different tools? Do it all in Drupal instead!

Planet Drupal - 9 November 2017 - 7:43

You need a website. You need to send an e-mail newsletter. You need to track (potential) volunteers, donors, or customers. You could use Drupal, Mailchimp and HubSpot. Or you could do it all in Drupal.

We've been using the tools above in our own organization, and we continue to use them. Yet we've been toying with the idea of moving more of our daily work to a Drupal-based solution. I'll try to outline some of the pros and cons of each approach, and I think you'll see that for many organizations the Drupal solution could end up on the winning side of the decision!

The Heavyweight Single-Purpose Tools

We've used a number of web-based services at myDropWizard to help keep sales, projects, and customer communication on track.

I'll outline just a few that we use that are very popular and that would make for a good comparison with a Drupal solution.

MailChimp

Currently, we use MailChimp for newsletters. I think MailChimp is a champion product with low prices and great features. MailChimp is probably the most used email newsletter platform, so its strengths are well known.

Study: Only a few parents use parental-control software

heise online Newsticker - 9 November 2017 - 7:30
A quarter of parents and guardians have installed a technical filter intended to protect their children from unsuitable content on the net. 90 percent consider protecting minors more important than easy access to online services.

Qualcomm Atheros: Android November update closes critical Wi-Fi driver vulnerabilities

heise online Newsticker - 9 November 2017 - 7:00
The Linux driver for Qualcomm Atheros Wi-Fi chipsets contains security holes that allow an attacker to compromise a device using manipulated Wi-Fi packets. Affected devices include Android phones from the Nexus and Pixel lines.

HPE server with supercomputer technology for 32 CPUs and 48 TByte of RAM

heise online Newsticker - 9 November 2017 - 7:00
Hewlett Packard Enterprise is introducing a scalable server with Intel CPUs designed above all to accelerate big-data applications. The shared-memory technology behind it comes from SGI.

Elevated Third: Marketing Automation, Meet Drupal

Planet Drupal - 8 November 2017 - 22:06
By Andy Mead, Wed, 11/08/2017 - 13:06

Oh, hi there. I’d be lying if I said I wasn’t expecting you. This is a blog after all. And supposedly people read these things, which is, supposedly, why you’re here. So pull up a seat (if you’re not already sitting) and I’ll tell you why Drupal is a great partner for Marketing Automation.

Ah, Marketing Automation. (Hereafter MA, because why read 7 syllables when you can read 2?) It’s arguably the most hyped business technology of the last decade or so, spoken about in hushed tones, as though simply subscribing to a platform will print money for you. Sadly that’s not the truth. But when used properly with digital strategy, it’s pretty good at what it does: capturing latent demand and turning it into sales. The tricky part is the modifying clause that opened the last sentence, “when used properly.”

What to expect from Marketing Automation?

Marketing Automation tools and platforms these days come loaded with bells and whistles, from custom reporting engines to fancy drag-and-drop campaign UIs and WYSIWYGs that let marketers build digital assets like landing pages and emails. And yet, despite all that fanciness, it’s still really hard to do Marketing Automation right. Why? Well, leaving aside strategic questions (a massive topic on its own), my own experience with MA always left me wanting two things: expressibility and scalability.

Drupal + Marketing Automation

While publishing workflows in Marketing Automation tools have improved over the years, they still can’t compete with a CMS, particularly one as powerful as Drupal. Drupal empowers users to express content in ways that go far beyond simple landing pages.

In fact, Drupal is used today for just about anything you can imagine, from powering Fortune 500 marketing websites to running weather.com and acting as the backbone of custom web applications. What’s possible with Drupal is really up to you. Just ask the guy who built it.

So, fine. Drupal is great and everything. But how does it help your marketing? Well, because Drupal is so flexible, you can integrate it with almost anything: Google Analytics, Pardot, Marketo, Eloqua, Salesforce, and on, and on, and on. In a quickly changing technology landscape that’s an incredible strength, because Drupal can act as the nervous system of your marketing technology stack.

“Marketing technology stack?” Yeah, I don’t like business jargon, either. But, it’s a helpful way to think about digital marketing tools. Because they are just that: tools with strengths and weaknesses. You probably wouldn’t use a screwdriver to drive a nail into the wall. Sure, you could, but there’s a better tool for the job: a hammer. Likewise, your MA platform could power all your digital assets, but there’s a better tool for that job, too: Drupal.

The right tools for the job

In my experience, organizing these tools around their strengths brings better results. Here at Elevated Third, we’ve done that by connecting Drupal to Marketing Automation platforms like Pardot, Marketo, and SharpSpring, using it as the front end for the services that power marketing programs. And MA is only one piece of that puzzle. Want to use something like HotJar? Drupal is happy to oblige.

Open source means flexibility 

So where does this flexibility come from? Drupal is open source software, and a massive developer community improves it daily. Arguably the greatest strength of open source software is its flexibility.

You don’t like the way something works? Easy. Let’s change it.

Is something broken? No problem, let’s fix it.

Got a new problem that off-the-shelf solutions don’t solve? Well, then, let’s build a solution for it.

Is Drupal the right tool for every job? I’d be lying (again) if I said it was. But it’s the right tool for jobs that require unique, flexible solutions. And it could be the right tool for your job, too. If you are curious, let's talk.

Cheeky Monkey Media: The Drupal Checklist Every Developer Needs

Planet Drupal - 8 November 2017 - 21:49
By cody, Wed, 11/08/2017 - 19:49

Are you almost finished setting up your Drupal website? At a glance, everything might look ready to go.

But, before you hit "publish," you need to make sure you haven't made any mistakes.

A writer proofreads before they post an article. Similarly, a developer should double check their work.

The last thing you want is to go live with your site and have something go wrong. Finding problems before you launch can save some headaches and embarrassment.

We've compiled a pre-launch Drupal checklist. When it's complete, you'll rest easy knowing that your website is ready to go.

Security

Security is first on this Drupal checklist because it's so important. Of course, you want to rest easy knowing that your site is secure when it launches. You also want your users to have peace of mind, knowing that their information is safe.

Double-checking your site's security will ensure that there's nothing you've missed that could leave you vulnerable to hackers.

Evolving Web: Profiling and Optimizing Drupal Migrations with Blackfire

Planet Drupal - 8 November 2017 - 21:34

A few weeks ago, we at Evolving Web finished migrating the Princeton University Press website to Drupal 8. The project was over 70% migrations. In this article, we will see how Blackfire helped us optimize our migrations by changing around two lines of code.

Before we start
  • This article is mainly for PHP / Drupal 8 back-end developers.
  • It is assumed that you know about the Drupal 8 Migrate API.
  • Code performance is analyzed with a tool named Blackfire.
  • Front-end performance analysis is not in the scope of this article.
The Problem

Here are some of the project requirements related to the problem, to help you get a better picture of what's going on:

  • A PowerShell script exports a bunch of data into CSV files on the client's server.
  • A custom migration source plugin, PUPCSV, reads the CSV files via SFTP.
  • Using hook_cron() in Drupal 8, we check hashes for each CSV.
  • If a file's MD5 hash changes, the migration is queued for import using the Drupal 8 Queue API (see the sketch after this list).
  • The CSV files usually have 2 types of changes:
    • Certain records are updated here and there.
    • Certain records are added to the end of the file.
  • When a migration is executed, migrate API goes line-by-line, doing the following things for every record:
    • Read a record from the data source.
    • Merge data related to the record from other CSV files (kind of an inner join between CSVs).
    • Compute hash of the record and compare it with the hash stored in the database.
    • If a hash is not found in the database, the record is created.
    • If a hash is found and it has changed, the record is updated.
    • If a hash is unchanged, no action is taken.
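
As a rough illustration of the cron-plus-queue part of this setup, here is a minimal sketch, not the project's actual code: the module name pup_migrate, the state key, the queue name and the file location are all hypothetical, and only pup_subjects is a real migration ID from this project. In the real setup, the new file hash is recorded by the source plugin itself once the last row has been imported (more on that below).

/**
 * Implements hook_cron().
 *
 * Sketch only: queue a migration for import when its CSV's MD5 hash changes.
 */
function pup_migrate_cron() {
  $queue = \Drupal::queue('pup_migrate_import');
  $state = \Drupal::state();
  // Only 'pup_subjects' is a real migration ID; the second one is a placeholder.
  foreach (['pup_subjects', 'pup_example'] as $migration_id) {
    // Hypothetical location of the CSV exported by the PowerShell script.
    $uri = 'private://import/' . $migration_id . '.csv';
    $hash = md5_file($uri);
    $previous = $state->get('pup_migrate.hash.' . $migration_id);
    if ($hash !== FALSE && $hash !== $previous) {
      // A QueueWorker plugin (not shown) runs the actual import; in our case,
      // the source plugin stores the new hash after the last row is imported.
      $queue->createItem(['migration' => $migration_id]);
    }
  }
}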

While running migrations, we found that they were spending too much time simply going through the CSV files and checking for changes in row hashes. For big migrations with over 40,000 records, migrate was taking several minutes to reach the end of the file, even on a high-end server. Since we were running migrate during cron (with queue workers), we had to ensure that any individual migration could be processed within the 3-minute PHP maximum execution time limit available on the server.

Analyzing migrations with Blackfire

At Evolving Web, we usually analyze performance with Blackfire before any major site is launched. Usually, we run Blackfire with the Blackfire Companion, which is currently available for Google Chrome and Firefox. However, since migrations are executed using drush, a command-line tool, we had to use the Blackfire CLI tool, like this:

$ blackfire run /opt/vendor/bin/drush.launcher migrate-import pup_subjects
Processed 0 items (0 created, 0 updated, 0 failed, 0 ignored) - done with 'pup_subjects'
Blackfire Run completed

Upon analyzing the Blackfire reports, we found some 50 unexpected SQL queries being triggered from somewhere within the PUPCSV::fetchNextRow() method. Quite surprising! PUPCSV refers to a migrate source plugin we wrote for fetching CSV files over FTP / SFTP. This plugin also tracks a hash of the CSV files and thereby allows us to skip a migration completely if the source files have not changed. If the source hash changes, the migration updates all rows, and when the last row has been migrated, we store the file's hash in the database from PUPCSV::fetchNextRow(). As a matter of fact, we are preparing another article about creating custom migrate source plugins, so stay tuned.

We found one database query per row, even though no records were being created or updated. That didn't seem very harmful until we saw the Blackfire report.

Code before Blackfire

Taking a closer look at the fetchNextRow() method, we found a call to MigrateSourceBase::count(), and that count() was taking 40% of the processing time! This is because it was being called for every row in the CSV. Since the source/cache_counts parameter was not set to TRUE in the migration YAML files, count() was iterating over all items to get a fresh count on each call. Thus, for a migration with 40,000 records, we were going through 40,000 x 40,000 records, and the PHP maximum execution time was being reached even before migrate could get to the last row! Here's a look at the code.

protected function fetchNextRow() {
  // If the migration is being imported...
  if (MigrationInterface::STATUS_IMPORTING === $this->migration->getStatus()) {
    // If we are at the last row in the CSV...
    if ($this->getIterator()->key() === $this->count()) {
      // Store source hash to remember the file as "imported".
      $this->saveCachedFileHash();
    }
  }
  return parent::fetchNextRow();
}

Code after Blackfire

We could have added the cache_counts parameter in our migration YAML files, but any change in the source configuration of the migrations would have made the migrate API update all records in all migrations. This is because a row's hash is computed from something like hash($row + $source). We did not want migrate to update all records, because certain migrations sometimes took around 7 hours to complete. Hence, we decided to statically cache the total record count to get things back on track:

protected function fetchNextRow() {
  // If the migration is being imported...
  if (MigrationInterface::STATUS_IMPORTING === $this->migration->getStatus()) {
    // Get total source record count and cache it statically.
    static $count;
    if (is_null($count)) {
      $count = $this->doCount();
    }
    // If we are at the last row in the CSV...
    if ($this->getIterator()->key() === $count) {
      // Store source hash to remember the file as "imported".
      $this->saveCachedFileHash();
    }
  }
  return parent::fetchNextRow();
}

Problem Solved. Merci Blackfire!

After the changes, we ran Blackfire again and found things to be 52% faster for a small migration with 50 records.

For a bigger migration with 4,359 records, the import time dropped from 1m 47s to only 12s, an improvement of roughly 89%. Wondering why we didn't include a screenshot for the bigger migration? We did not (or rather could not) generate a report for the big migration, for two reasons:

  • While it works, Blackfire stores function-call and other information in memory, so running a huge migration with Blackfire might be a bit slow. Besides, our objective was to find the problem, and we could do that more easily while looking at smaller figures.
  • When running a migration with thousands of rows, the migration functions are called thousands of times! Blackfire collects data for each of these calls, so the collected data sometimes becomes too heavy, and Blackfire rejects the payload with an error message like this:
The Blackfire API answered with a 413 HTTP error () Error detected during upload: The Blackfire API rejected your payload because it's too big.

Which makes a lot of sense. As a matter of fact, for the other case study given below, we used the --limit=1 parameter to profile code performance for a single row.

A quick brag about another 50% Improvement?

Apart from this jackpot, we also found room for another 50% improvement (from 7h to 3h 32m) for one of our migrations which was using the Touki FTP library. This migration was doing the following:

  • Going through around 11,000 records in a CSV file.
  • Downloading the files over FTP when required.

A Blackfire analysis of this migration revealed something strange. For every row, the following was happening behind the scenes:

  • If a file download was required, we were doing FTP::findFileByName($name).
  • To get the file, Touki was:
    • Getting a list of all files in the directory;
    • Creating a File object for every file;
    • Creating various permission, owner and other objects for every File object;
    • Passing all the files through a callback to see if its name was $name;
    • If the name matched, returning that file and discarding all the other File objects.

Hence, for every file download, Touki FTP was creating 11,000 File objects of which only one was actually used! To resolve this, we decided to use the lower-level FTP::get($source, $destination) method, which let us bypass the 50,000 or more objects that were being created per record (roughly 11,000 * 50,000 objects across all records). This almost halved the import time for that migration when working with all 11,000 records! Here's a screenshot of Blackfire's report for a single row.
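
To make the change concrete, here is a minimal sketch of the switch, assuming a connected Touki FTP client in $ftp; apart from the two method names taken from the paragraph above, the variable names are illustrative rather than the project's actual code.

// Before: findFileByName() scans the whole directory listing, building a
// File object (plus permission, owner and other objects) for every entry,
// only to keep the single one whose name matches.
// $file = $ftp->findFileByName($name);

// After: the lower-level get() fetches the file directly, so no
// directory-wide object graph is built at all ($source and $destination
// as per the library's get() signature).
$ftp->get($source, $destination);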

So the next time you think something fishy is going on with code you wrote, don't forget to use Blackfire! And don't forget to leave your feedback, questions and even article suggestions in the comments section below.

More about Blackfire

Blackfire is a code profiling tool for PHP which gives you nice-looking reports about your code's performance. With the help of these reports, you can analyze the memory, time and other resources consumed by various functions and optimize your code where necessary. If you are new to Blackfire, the official documentation and the video tutorials on Blackfire's YouTube channel are good places to start.

Apart from all this, the paid version of Blackfire lets you set up automated tests and gives you recommendations not only for Drupal but for various other PHP frameworks as well.

Next Steps
  • Try Blackfire for free on a sample project of your choice to see what you can find.
  • Watch video tutorials on Blackfire's YouTube channel.
  • Read the tutorial on creating custom migration source plugins written by my colleague (coming soon).

Zahlen, bitte! Complex numbers: a marketing disaster in mathematics

heise online Newsticker - 8 November 2017 - 19:30
If an advertising agency had been asked to market new numbers that solve far more problems than the existing ones, it would hardly have chosen off-putting words like "complex" or "imaginary" ... but in the 17th century there were no copywriters.

Web Summit 2017: VW and Google to research quantum computers together

heise online Newsticker - 8 November 2017 - 19:00
The German car maker and the US giant want to jointly develop applications for quantum computers, among other things to get a grip on the flood of data from future mobility applications.

Electromobility: Hannover to get Germany's highest density of charging stations

heise online Newsticker - 8 November 2017 - 19:00
The Stadtwerke Hannover utility plans to install 600 charging stations over the next three years, which would make the greater Hannover area the front-runner in Germany.

"Glasfaser only": Mittelstand und Breitbandbranche sehen Potenzial für neue Arbeitsplätze durch Glasfaser

heise online Newsticker - 8. November 2017 - 19:00
Die kommende Bundesregierung solle beim Breitbandausbau allein auf Glasfaser setzen, meinen die Verbände BVMW und BREKO.

iOS 11.2 to get introductory prices for subscription apps

heise online Newsticker - 8 November 2017 - 19:00
Apple is trying to make it easier for developers to get started with subscription models. With the next iOS version, they will be allowed to offer discounted introductory rates.

Lullabot: Styling the WYSIWYG Editor in Drupal 8

Planet Drupal - 8 November 2017 - 18:42

Drupal 8 ships with a built-in WYSIWYG editor called CKEditor. It’s great to have it included in core, but I had some questions about how to control the styling. In particular, I wanted the styling in the editor to look like my front-end theme, even though I use an administration theme for the node form. I spent many hours trying to find the answer, but it turned out to be simple if a little confusing.

In my example, I have a front-end theme called “Custom Theme” that extends the Bootstrap theme. I use core’s “Seven” theme as an administration theme, and I checked the box to use the administration theme for my node forms. 

My front-end theme adds custom fonts to Bootstrap and uses a larger-than-normal font size, so it’s distinctly different from the standard styling that comes with the WYSIWYG editor.

Front End Styling vs. WYSIWYG Styling (screenshots)

Out of the box, the styling in the editor looks very different than my front-end theme. The font family and line height are wrong, and the font size is too small.


It turns out there are two ways to alter the styling in the WYSIWYG editor: adding some information to the default theme’s info.yml file, or implementing hook_ckeditor_css_alter() in either a module or a theme. The kicker is that the info.yml changes go in the FRONT-END theme, even though I’m using an admin theme on the node form.
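
The rest of this post uses the info.yml approach, but for reference, here is a minimal sketch of the alter-hook approach, assuming a hypothetical custom module named mymodule; the theme name and CSS paths are the ones from my example.

/**
 * Implements hook_ckeditor_css_alter().
 *
 * Sketch only: add the front-end theme's CSS files to the CKEditor iframe.
 */
function mymodule_ckeditor_css_alter(array &$css, \Drupal\editor\Entity\Editor $editor) {
  $theme_path = drupal_get_path('theme', 'custom_theme');
  $css[] = $theme_path . '/css/font-family.css';
  $css[] = $theme_path . '/css/style.css';
}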

I added the following information to my default theme info file, custom_theme.info.yml. The font-family.css and style.css files are the front-end theme CSS files that I want to pass into the WYSIWYG editor. Even if I select the option to use the front-end theme for the node form, the CSS from that theme will not make it into the WYSIWYG editor without making this change, so this is necessary whether or not you use an admin theme on the node form!  

name: "Custom Theme" description: A subtheme of Bootstrap theme for Drupal 8. type: theme core: 8.x base theme: bootstrap ckeditor_stylesheets: - https://fonts.googleapis.com/css?family=Open+Sans - css/font-family.css - css/style.css libraries: ... WYSIWYG Styling

After this change, the font styles in the WYSIWYG editor match the text in the primary theme.


When CKEditor builds the editor iframe, it checks to see which theme is the default theme, then looks to see if that theme has values in the info.yml file for ckeditor_stylesheets. If it finds anything, it adds those CSS files to the iframe. Relative CSS file URLs are assumed to be files in the front-end theme’s directory, or you can use absolute URLs to other files.

The contributed Bootstrap theme does not implement ckeditor_stylesheets, so I had to create a sub-theme to take advantage of this. I always create a sub-theme anyway, to add the little tweaks I want to make. In this case, my sub-theme also uses a Google font instead of the default font, and I can pass that font into the WYSIWYG editor as well.

TaDa!

That was easy to do, but it took me quite a while to understand how it worked. So I decided to post it here in case anyone else is as confused as I was.

More Information

To debug this further and understand how the styling inside the WYSIWYG editor gets built, you can refer to the relevant code in two core files. First, ckeditor.module:

/**
 * Retrieves the default theme's CKEditor stylesheets.
 *
 * Themes may specify iframe-specific CSS files for use with CKEditor by
 * including a "ckeditor_stylesheets" key in their .info.yml file.
 *
 * @code
 * ckeditor_stylesheets:
 *   - css/ckeditor-iframe.css
 * @endcode
 */
function _ckeditor_theme_css($theme = NULL) {
  $css = [];
  if (!isset($theme)) {
    $theme = \Drupal::config('system.theme')->get('default');
  }
  if (isset($theme) && $theme_path = drupal_get_path('theme', $theme)) {
    $info = system_get_info('theme', $theme);
    if (isset($info['ckeditor_stylesheets'])) {
      $css = $info['ckeditor_stylesheets'];
      foreach ($css as $key => $url) {
        if (UrlHelper::isExternal($url)) {
          $css[$key] = $url;
        }
        else {
          $css[$key] = $theme_path . '/' . $url;
        }
      }
    }
    if (isset($info['base theme'])) {
      $css = array_merge(_ckeditor_theme_css($info['base theme']), $css);
    }
  }
  return $css;
}

and Plugin/Editor/CKEditor.php:  

/** * Builds the "contentsCss" configuration part of the CKEditor JS settings. * * @see getJSSettings() * * @param \Drupal\editor\Entity\Editor $editor * A configured text editor object. * @return array * An array containing the "contentsCss" configuration. */ public function buildContentsCssJSSetting(Editor $editor) { $css = [ drupal_get_path('module', 'ckeditor') . '/css/ckeditor-iframe.css', drupal_get_path('module', 'system') . '/css/components/align.module.css', ]; $this->moduleHandler->alter('ckeditor_css', $css, $editor); // Get a list of all enabled plugins' iframe instance CSS files. $plugins_css = array_reduce($this->ckeditorPluginManager->getCssFiles($editor), function($result, $item) { return array_merge($result, array_values($item)); }, []); $css = array_merge($css, $plugins_css); $css = array_merge($css, _ckeditor_theme_css()); $css = array_map('file_create_url', $css); $css = array_map('file_url_transform_relative', $css); return array_values($css); }

Electronic health card: launch date for the online connection postponed

heise online Newsticker - 8 November 2017 - 18:30
With the approval of the Bundesrat, all medical practices and hospitals now have until 31 December 2018 to activate their online connection for the electronic health card. The previous deadline of 30 June 2018 had become unrealistic.

Skills shortage: Bitkom counts 55,000 open positions for IT specialists

heise online Newsticker - 8 November 2017 - 18:30
Compared with the previous year, the number of open positions for IT specialists has grown by 8 percent, reports the IT industry association Bitkom.

Xbox One X: Microsoft's 4K game console now hits the market

heise online Newsticker - 8 November 2017 - 18:00
A good year after the PS4 Pro, Microsoft finally has a game console for 4K TVs on the market. Despite its powerful hardware, the console remains whisper-quiet most of the time.

Commentary: Intel's survival is at stake

heise online Newsticker - 8 November 2017 - 17:30
Despite billions in profits, Intel has rested on its competitors' weakness for years and now has no answers ready for the markets of the future, comments Martin Fischer.

Leaky TLS connections: Fortinet plugs eight-year-old Project Mogul hole in FortiOS

heise online Newsticker - 8 November 2017 - 17:00
Flaws in the way Fortinet devices break open TLS connections allow attackers to inject arbitrary information into secured connections. Similar attacks have been known for eight years.

Amazon Marketplace: sellers did not receive their money

heise online Newsticker - 8 November 2017 - 17:00
For a whole week, Amazon did not pay out money to its sellers, apparently due to complex technical problems. The outstanding payments put many sellers in a bind, right at the start of the Christmas season.