c't uplink 19.9: The history of the computer, told by Andreas Stiller

heise online Newsticker - 5 December 2017 - 7:45
In our 200th episode – it all began with episode 0 – we look back a very long way with a c't veteran: Andreas Stiller tells Jan-Keno Janssen not only about his first computer, but also about everything that came after it.

PreviousNext: Using ES6 in your Drupal Components

Planet Drupal - 5 December 2017 - 1:22

With the release of Drupal 8.4.x and its use of ES6 (ECMAScript 2015) in Drupal core, we’ve started the task of updating our jQuery plugins/widgets to use the new syntax. This post will cover what we’ve learnt so far and what the benefits of doing this are.

by Rikki Bochow / 5 December 2017

If you’ve read my post about the Asset Library system you’ll know we’re big fans of the Component-Driven Design approach, and having a javascript file per component (where needed of course) is ideal. We also like to keep our JS widgets generic so that the entire component (entire styleguide for that matter) can be used outside of Drupal as well. Drupal behaviours and settings are still used but live in a different javascript file to the generic widget, and simply call its function, passing in Drupal settings as “options” as required.

Here is an example with an ES5 jQuery header component, with a breakpoint value set somewhere in Drupal:

@file header.js

(function ($) {
  $.fn.header = function (options) {
    var opts = $.extend({}, $.fn.header.defaults, options);
    return this.each(function () {
      var $header = $(this);
      // do stuff with $header
    });
  };

  // Overridable defaults
  $.fn.header.defaults = {
    breakpoint: 700,
    toggleClass: 'header__toggle',
    toggleClassActive: 'is-active'
  };
})(jQuery);

@file header.drupal.js

(function ($, Drupal, drupalSettings) {
  Drupal.behaviors.header = {
    attach: function (context) {
      $('.header', context).header({
        breakpoint: drupalSettings.my_theme.header.breakpoint
      });
    }
  };
})(jQuery, Drupal, drupalSettings);

Converting these files into a different language is relatively simple as you can do one at a time and slowly chip away at the full set. Since ES6 is used in the popular JS frameworks it’s a good starting point for slowly moving towards a “progressively decoupled” front-end.

Support for ES6

Before going too far I should mention support for this syntax isn’t quite widespread enough yet! No fear though, we just need to add a “transpiler” into our build tools. We use Babel, with the babel-preset-env, which will convert our JS for us back into ES5 so that the required older browsers can still understand it.
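As a rough illustration of what the transpiler does, here is an ES6 function next to a hand-written, simplified ES5 equivalent (Babel's real output differs in the details) - both behave identically:

```javascript
// ES6 source, the way we write it:
const greet = (name = 'world') => `Hello, ${name}!`;

// Roughly what a transpiler emits for older browsers (hand-written,
// simplified - Babel's actual output differs in the details):
var greetES5 = function (name) {
  if (name === undefined) { name = 'world'; }
  return 'Hello, ' + name + '!';
};

console.log(greet('Drupal'));    // Hello, Drupal!
console.log(greetES5('Drupal')); // Hello, Drupal!
```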

Our Gulp setup will transpile any .es6.js file and rename it (so we’re not replacing our working file), before passing the renamed file into our minifying Gulp task.

With the Babel ENV preset we can specify which browsers we actually need to support, so that we’re doing the absolute minimum transpilation (is that a word?) and keeping the output as small as possible. There’s no need to bloat your JS trying to support browsers you don’t need to!

import gulp from 'gulp';
import babel from 'gulp-babel';
import rename from 'gulp-rename';
import path from 'path';
import config from './config';

// Helper function for renaming files
const bundleName = (file) => {
  file.dirname = file.dirname.replace(/\/src$/, '');
  file.basename = file.basename.replace('.es6', '');
  file.extname = '.bundle.js';
  return file;
};

const transpileFiles = [
  `${config.js.src}/**/*.js`,
  `${config.js.modules}/**/*.js`,
  // Ignore already minified files.
  `!${config.js.src}/**/*.min.js`,
  `!${config.js.modules}/**/*.min.js`,
  // Ignore bundle files, so we don’t transpile them twice (will make more sense later)
  `!${config.js.src}/**/src/*.js`,
  `!${config.js.modules}/**/src/*.js`,
  `!${config.js.src}/**/*.bundle.js`,
  `!${config.js.modules}/**/*.bundle.js`,
];

const transpile = () => (
  gulp.src(transpileFiles, { base: './' })
    .pipe(babel({
      presets: [['env', {
        modules: false,
        useBuiltIns: true,
        targets: { browsers: ["last 2 versions", "> 1%"] },
      }]],
    }))
    .pipe(rename(file => bundleName(file)))
    .pipe(gulp.dest('./'))
);

transpile.description = 'Transpile javascript.';
gulp.task('scripts:transpile', transpile);

Which uses:

$ yarn add path gulp gulp-babel gulp-rename babel-preset-env --dev

On a side note, we’ll be outsourcing our entire Gulp workflow real soon. We’re just working through a few extra use cases for it, so keep an eye out!

Learning ES6

Reading about ES6 is one thing, but I find getting into the code to be the best way for me to learn things. We like to follow Drupal coding standards, so we point our eslint config to extend what’s in Drupal core. Upgrading to 8.4.x obviously threw a LOT of new lint errors, which we usually disabled until there was time to correct them. But you can use these errors as a tailored ES6 guide. Tailored because it’s directly applicable to how you usually write JS (assuming you wrote the first code).

Working through each error, looking up the description and correcting it manually (as opposed to using the --fix flag) was a great way to learn. It took some time, but once you understand a rule you can start skipping it, then use the --fix flag at the end for a bulk correction.

Of course you're also a Google away from a tonne of online resources and videos to help you learn if you prefer that approach!

ES6 with jQuery

Our original code is usually in jQuery, and I didn’t want to add removing jQuery into the refactor work, so currently we’re using both which works fine. Removing it from the mix entirely will be a future task.

The biggest gotcha was probably our use of this, which needed to be reviewed once functions were converted to arrow functions. Taking our header example from above:

return this.each(function () {
  var $header = $(this);
});

Once converted into an arrow function, this inside the loop is no longer bound by .each() to the current element - it keeps the value from the surrounding scope, so it’s still the same jQuery object we’re looping through rather than an individual element of the loop. Explicitly accepting the element as an argument of the .each() callback lets us access the individual element again.

return this.each((i, obj) => {
  const $header = $(obj);
});
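Here is the same behaviour in plain JavaScript, with a tiny hypothetical each() helper standing in for jQuery's, to show why the argument is needed:

```javascript
// Tiny stand-in for jQuery's .each(), which calls the callback with
// `this` set to the current item (hypothetical helper, for illustration).
function each(items, callback) {
  items.forEach(function (item, i) {
    callback.call(item, i, item);
  });
}

const items = [{ id: 1 }, { id: 2 }];
const viaThis = [];
const viaArg = [];

// ES5 function: `this` is rebound to the current item on every iteration.
each(items, function (i, obj) {
  viaThis.push(this.id);
});

// Arrow function: `this` keeps its surrounding value, so we read the
// element from the callback's arguments instead.
each(items, (i, obj) => {
  viaArg.push(obj.id);
});

console.log(viaThis); // [ 1, 2 ]
console.log(viaArg);  // [ 1, 2 ]
```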

Converting the jQuery plugins (or jQuery UI widgets) to ES6 modules was a relatively easy task as well… instead of:

(function ($) {
  $.fn.header = function (options) {
    var opts = $.extend({}, $.fn.header.defaults, options);
    return this.each(function () {
      var $header = $(this);
      // do stuff with $header
    });
  };

  // Overridable defaults
  $.fn.header.defaults = {
    breakpoint: 700,
    toggleClass: 'header__toggle',
    toggleClassActive: 'is-active'
  };
})(jQuery);

We just make it a normal-ish function:

const headerDefaults = {
  breakpoint: 700,
  toggleClass: 'header__toggle',
  toggleClassActive: 'is-active'
};

function header(options) {
  return (($) => {
    const opts = $.extend({}, headerDefaults, options);
    return $(this).each((i, obj) => {
      const $header = $(obj);
      // do stuff with $header
    });
  })(jQuery);
}

export { header as myHeader };

Since the exported ES6 module has to be a top-level function, the jQuery wrapper was moved inside it. Because that wrapper is an arrow function, the this object of header is still available inside it. There might be a nicer way to do this but I haven't worked it out yet! Everything inside the module is the same as I had in the jQuery plugin, just updated to the new syntax.

I also like to rename my modules when I export them so they’re name-spaced based on the project, which helps when using a mix of custom and vendor scripts. But that’s entirely optional.

Now that we have our generic JS using ES6 modules it’s even easier to share and reuse them. Remember our Drupal JS separation? We no longer need to load both files into our theme. We can import our ES6 module into our .drupal.js file then attach it as a Drupal behaviour. 

@file header.drupal.js

import { myHeader } from './header';

(($, { behaviors }, { my_theme }) => {
  behaviors.header = {
    attach(context) {
      myHeader.call($('.header', context), {
        breakpoint: my_theme.header.breakpoint
      });
    }
  };
})(jQuery, Drupal, drupalSettings);

So there are a few differences here: we're importing the myHeader function from our other file, we're destructuring the Drupal and drupalSettings arguments to simplify them, and we're using .call() on the function to pass in the object before setting its arguments. Now the header.drupal.js file is the only file we need to tell Drupal about.

Some other nice additions in ES6 that have less to do with jQuery are template literals (being able to say $(`.${opts.toggleClass}`) instead of $('.' + opts.toggleClass)) and the more obvious use of const and let instead of var, which are block-scoped.
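A quick plain-JavaScript sketch of both features:

```javascript
const opts = { toggleClass: 'header__toggle' };

// Template literal vs. string concatenation - identical results:
const selectorOld = '.' + opts.toggleClass;
const selectorNew = `.${opts.toggleClass}`;
console.log(selectorNew); // .header__toggle

// var is function-scoped and leaks out of blocks:
for (var i = 0; i < 3; i += 1) {}
console.log(i); // 3

// let/const are block-scoped:
let count = 0;
for (let j = 0; j < 3; j += 1) {
  count += 1;
}
// console.log(j); // ReferenceError - j only exists inside the loop
console.log(count); // 3
```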

Importing modules into different files requires an extra step in our build tools, though. Because browser support for ES6 modules is also a bit too low, we need to “bundle” the modules together into one file. The most popular bundler available is Webpack, so let’s look at that first.

Bundling with Webpack

Webpack is super powerful and was my first choice when I reached this step. But it’s not really designed for this component-based approach - few bundlers truly are. Bundlers are great at taking one entry JS file which has multiple ES6 modules imported into it. Those modules might be broken down into smaller ES6 modules, and at some level are components much like ours, but ultimately they end up being bundled into ONE file.

But that’s not what I wanted! What I wanted, as it turned out, wasn’t very common. I wanted to add Webpack into my Gulp tasks much like our Sass compilation is, taking a “glob” of JS files from various folders (which I don’t really want to have to list), then to create a .bundle.js file for EACH component which included any ES6 modules I used in those components.

The each part was the real clincher. Getting multiple entry points into Webpack is one thing, but multiple destination points as well was certainly a challenge. The vinyl-named npm module was a lifesaver. This is what my Gulp task looked like:

import gulp from 'gulp';
import gulpWebpack from 'webpack-stream';
import webpack from 'webpack'; // Use newer webpack than webpack-stream
import named from 'vinyl-named';
import path from 'path';
import config from './config';

const bundleFiles = [
  config.js.src + '/**/src/*.js',
  config.js.modules + '/**/src/*.js',
];

const bundle = () => (
  gulp.src(bundleFiles, { base: './' })
    // Define [name] with the path, via vinyl-named.
    // (Regular function, not an arrow, so `this.queue` works.)
    .pipe(named(function (file) {
      const thisFile = bundleName(file); // Reuse our naming helper function
      // Set named value and queue.
      thisFile.named = thisFile.basename;
      this.queue(thisFile);
    }))
    // Run through webpack with the babel loader for transpiling to ES5.
    .pipe(gulpWebpack({
      output: {
        filename: '[name].bundle.js', // Filename includes path to keep directories
      },
      module: {
        loaders: [{
          test: /\.js$/,
          exclude: /node_modules/,
          loader: 'babel-loader',
          query: {
            presets: [['env', {
              modules: false,
              useBuiltIns: true,
              targets: { browsers: ["last 2 versions", "> 1%"] },
            }]],
          },
        }],
      },
    }, webpack))
    // Output each [name].bundle.js file next to its source
    .pipe(gulp.dest('./'))
);

bundle.description = 'Bundle ES6 modules.';
gulp.task('scripts:bundle', bundle);

Which required:

$ yarn add path webpack webpack-stream babel-loader babel-preset-env vinyl-named --dev

This worked. But Webpack adds some boilerplate JS to its bundle output file, which it needs for module wrapping etc. That's totally fine when the output is a single file, but adding that (exact same) overhead to each of our component JS files starts to add up - especially when we have multiple component JS files loading on the same page, duplicating that code.

It only made each component a couple of KB bigger (once minified; an unminified Webpack bundle is much bigger), but the site seemed so much slower. And it wasn’t just our impression: a whole bunch of our javascript tests started failing because the timeouts we’d set weren’t being met. Comparing the page speed to the non-Webpack version showed a definite impact on performance.

So what are the alternatives? Browserify is probably the second most popular but didn’t have the same ES6 module import support. Rollup.js is kind of the new bundler on the block and was recommended to me as a possible solution. Looking into it, it did indeed sound like the lean bundler I needed. So I jumped ship!

Bundling with Rollup.js

The setup was very similar so it wasn’t hard to switch over. Rollup.js has a similar problem with single entry/destination points, but it was much easier to resolve with the ‘gulp-rollup-each’ npm module. My Gulp task now looks like:

import gulp from 'gulp';
import rollup from 'gulp-rollup-each';
import babel from 'rollup-plugin-babel';
import resolve from 'rollup-plugin-node-resolve';
import commonjs from 'rollup-plugin-commonjs';
import path from 'path';
import config from './config';

const bundleFiles = [
  config.js.src + '/**/src/*.js',
  config.js.modules + '/**/src/*.js',
];

const bundle = () => {
  return gulp.src(bundleFiles, { base: './' })
    .pipe(rollup({
      plugins: [
        resolve(),
        commonjs(),
        babel({
          presets: [['env', {
            modules: false,
            useBuiltIns: true,
            targets: { browsers: ["last 2 versions", "> 1%"] },
          }]],
          babelrc: false,
          plugins: ['external-helpers'],
        }),
      ],
    }, (file) => {
      const thisFile = bundleName(file); // Reuse our naming helper function
      return {
        format: 'umd',
        name: path.basename(thisFile.path),
      };
    }))
    // Output each [name].bundle.js file next to its source
    .pipe(gulp.dest('./'));
};

bundle.description = 'Bundle ES6 modules.';
gulp.task('scripts:bundle', bundle);

We don’t need vinyl-named to rename the file anymore, we can do that as a callback of gulp-rollup-each. But we need a couple of extra plugins to correctly resolve npm module paths.

So for this we needed:

$ yarn add path gulp-rollup-each rollup-plugin-babel babel-preset-env rollup-plugin-node-resolve rollup-plugin-commonjs --dev

Rollup.js does still add a little bit of boilerplate JS but it’s a much more acceptable amount. Our JS tests all passed so that was a great sign. Page speed tests showed the slight improvement I was expecting, having bundled a few files together. We're still keeping the original transpile Gulp task too for ES6 files that don't include any imports, since they don't need to go through Rollup.js at all.

Webpack might still be the better option for more advanced things that a decoupled frontend might need, like Hot Module Replacement. But for simple or only slightly decoupled components Rollup.js is my pick.

Next steps

Some modern browsers already support ES6 module imports, so this whole bundle step is becoming somewhat redundant. Ideally the bundled file, with its overhead and old-fashioned code, is only used on those older browsers that can’t handle the new and improved syntax, and the modern browsers use straight ES6...

Luckily this is possible with a couple of script attributes. Our .bundle.js file can be included with the nomodule attribute, alongside the source ES6 file with a type="module" attribute. Older browsers ignore the type="module" file entirely because modules aren’t supported, and browsers that do support modules ignore the nomodule file because it told them to. This article explains it more.
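As a sketch of that markup (the file names here are hypothetical placeholders; in a real Drupal theme these attributes would be added via the library definitions):

```html
<!-- Modern browsers fetch the ES6 module and skip the nomodule script. -->
<script type="module" src="header.drupal.js"></script>

<!-- Older browsers don't understand type="module", so they skip it and
     load the transpiled bundle instead. -->
<script nomodule src="header.drupal.bundle.js"></script>
```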

Then we'll start replacing the jQuery entirely, even look at introducing a Javascript framework like React or Glimmer.js to the more interactive components to progressively decouple our front-ends!

Tagged JavaScript, ES6, Progressive Decoupling

Dries Buytaert: We have 10 days to save net neutrality

Planet Drupal - 4 December 2017 - 20:51

Last month, the Chairman of the Federal Communications Commission, Ajit Pai, released a draft order that would soften net neutrality regulations. He wants to overturn the restrictions that make paid prioritization, blocking or throttling of traffic unlawful. If approved, this order could drastically alter the way that people experience and access the web. Without net neutrality, Internet Service Providers could determine what sites you can or cannot see.

The proposed draft order is disheartening. Millions of Americans are trying to save net neutrality; the FCC has received over 5 million emails, 750,000 phone calls, and 2 million comments. Unfortunately this public outpouring has not altered the FCC's commitment to dismantling net neutrality.

The commission will vote on the order on December 14th. We have 10 days to save net neutrality.

Although I have written about net neutrality before, I want to explain the consequences and urgency of the FCC's upcoming vote.

What does Pai's draft order say?

Chairman Pai has long been an advocate for "light touch" net neutrality regulations, and claims that repealing net neutrality will allow "the federal government to stop micromanaging the Internet".

Specifically, Pai aims to scrap the protection that classifies ISPs as common carriers under Title II of the Communications Act of 1934. Radio and phone services are also protected under Title II, which prevents companies from charging unreasonable rates or restricting access to services that are critical to society. Pai wants to treat the internet differently, and proposes that the FCC should simply require ISPs "to be transparent about their practices". The responsibility of policing ISPs would also be transferred to the Federal Trade Commission. Instead of maintaining the FCC's clear-cut and rule-based approach, the FTC would practice case-by-case regulation. This shift could be problematic as a case-by-case approach could make the FTC a weak consumer watchdog.

The consequences of softening net neutrality regulations

At the end of the day, frail net neutrality regulations mean that ISPs are free to determine how users access websites, applications and other digital content.

It is clear that depending on ISPs to be "transparent" will not protect against implementing fast and slow lanes. Rolling back net neutrality regulations means that ISPs could charge website owners to make their website faster than others. This threatens the very idea of the open web, which guarantees an unfettered and decentralized platform to share and access information. Gravitating away from the open web could create inequity in how communities share and express ideas online, which would ultimately intensify the digital divide. This could also hurt startups as they now have to raise money to pay for ISP fees or fear being relegated to the "slow lane".

The way I see it, implementing "fast lanes" could alter the technological, economic and societal impact of the internet we know today. Unfortunately it seems that the chairman is prioritizing the interests of ISPs over the needs of consumers.

What you can do today

Chairman Pai's draft order could dictate the future of the internet for years to come. In the end, net neutrality affects how people, including you and me, experience the web. I've dedicated both my spare time and my professional career to the open web because I believe the web has the power to change lives, educate people, create new economies, disrupt business models and make the world smaller in the best of ways. Keeping the web open means that these opportunities can be available to everyone.

If you're concerned about the future of net neutrality, please take action. Share your comments with the U.S. Congress and contact your representatives. Speak up about your concerns with your friends and colleagues. Organizations like The Battle for the Net help you contact your representatives — it only takes a minute!

Now is the time to stand up for net neutrality: we have 10 days and need everyone's help.

75 years ago: The first artificial nuclear chain reaction

heise online Newsticker - 4 December 2017 - 20:00
On 2 December 1942, Enrico Fermi and Leo Szilard achieved a breakthrough at the University of Chicago: 410 tonnes of graphite, uranium and uranium oxide sustained a stable chain reaction for 28 minutes. The reactor was the first step towards the atomic bomb.

Western Digital develops RISC-V controllers for hard drives with Transmeta founder

heise online Newsticker - 4 December 2017 - 20:00
Future controllers for WD storage systems will rely on artificial intelligence and the open RISC-V architecture to deliver higher performance and new features.

macOS High Sierra: Update and clean install bring back the root hole

heise online Newsticker - 4 December 2017 - 19:00
Anyone who does a fresh install of macOS 10.13 right now, or updates to 10.13.1, once again has an open system – but is lulled into a false sense of security by Software Update.

Google launches an API for the VR/AR library Poly

heise online Newsticker - 4 December 2017 - 19:00
The interface provides a direct connection to the platform for sharing objects for AR and VR applications. Alongside the REST API there are tools for integrating the objects into various applications.

Acquia Lightning Blog: Migrating to Content Moderation with Lightning

Planet Drupal - 4 December 2017 - 18:38
by Adam Balsam, Mon, 12/04/2017 - 11:38

NOTE: This blog post is about a future release of Lightning. Lightning 2.2.4, with the migration path to Content Moderation, will be released Wednesday, December 6th.

The second of two major migrations this quarter is complete! Lightning 2.2.4 will migrate you off of Workbench Moderation and onto Core Workflows and Content Moderation. (See our blog post about Core Media, our first major migration.)

The migration was a three-headed beast:

  1. The actual migration which included migrating states and transitions into Workflows and migrating the states of individual entities into Content Moderation.
  2. Making sure other Lightning Workflow features continued to work with Content Moderation, including the ability to schedule state transitions for content.
  3. Feature parity between Workbench Moderation and Content Moderation.
Tryclyde - the three-headed CM migration beast

The actual migration

Content Moderation was not a direct port of Workbench Moderation. It introduced the concept of Workflows which abstracts states and transitions from Content Moderation. As a result, the states and transitions that users had defined in WBM might not easily map to Workflows - especially if different content types have different states available.

To work around this, the migrator generates a hash of all available states per content type; then groups content types with identical hashes into Workflows. As an example, a site with the following content types and states would result in three Workflows as indicated by color:

WBM states/transition mapping to Workflows

The second half of the migration was making sure all existing content retained the correct state. Early prototypes used the batch API to process states, but this quickly became unscalable. In the end, we used the Migrate module to:

  1. Store the states of all entities and then remove them from the entities themselves.
  2. Uninstall Workbench Moderation and install Workflows + Content Moderation.
  3. Map the stored states back to their original entities as Content Moderation fields.

Note: This section of Lightning migration was made available as the contrib module WBM2CM. The rest of the migration is Lightning-specific.

Other Lightning Workflow features

Lightning Workflow does more than just provide states. Among other things, it also allows users to schedule state transitions. We have used the Scheduled Updates module for this since its introduction. Unfortunately, Scheduled Updates won't work with the computed field that is provided by Content Moderation. As a result, we ended up building a scheduler into Lightning Workflow itself.

Scheduled Updates is still appropriate and recommended for more complex scheduling - like for body fields or taxonomy term names. But for the basic content state transitions (i.e., publish this on datetime) you can use native Lightning Workflow.

As an added bonus, we sidestep a nasty translation bug (feature request?) that has been giving us problems with Scheduled Updates.

Feature parity

While Workflows is marked as stable in Core, Content Moderation is still in beta. This is partially because it's still missing some key features and integrations that Lightning uses. Specifically, Lightning has brought in patches and additional code so that we can have basic integration between Content Moderation ↔ Views and Content Moderation ↔ Quick Edit.

Want to try it out?

Assuming a standard Composer setup, you can update to the latest Lightning with the following. The migration is included in Lightning 2.2.4 and above:

$ composer update acquia/lightning --with-dependencies

Once you have updated your code, you can have Lightning automatically apply all pending updates, including the Content Moderation migration with the following (recommended):

$ /path/to/console/drupal update:lightning --no-interaction

Or you can just enable the WBM2CM module manually and trigger the migration with:

$ drush wbm2cm-migrate


Study: Computer science & co. seen as not creative enough by young women

heise online Newsticker - 4 December 2017 - 18:30
A Europe-wide survey of girls and women between 11 and 30 found that it is above all creative minds who are interested in the so-called STEM subjects. From the age of 15, however, many come to believe that computer science is dull.

BlackBerry Motion review: high build quality, good business tools

heise online Newsticker - 4 December 2017 - 18:30
The BlackBerry Motion is the second Android smartphone from BlackBerry Mobile. Unlike the KEYone, it has no physical keyboard. Does it still have a reason to exist? TechStage put it to the test.

Forensics experts: iOS 11 security hangs by a thread

heise online Newsticker - 4 December 2017 - 18:30
In iOS 11 the iPhone's passcode serves as a master key, a forensics software maker complains. Anyone who knows the code can read out all data unhindered and hijack the iCloud account despite two-factor protection.

Under instead of over: Criminals use pop-unders for covert crypto-mining

heise online Newsticker - 4 December 2017 - 18:00
A crypto-mining strategy discovered by security researchers slips hidden browser windows under users' screens, letting the miners quietly rake in even more virtual money.

Mediacurrent: Annotate to Communicate

Planet Drupal - 4 December 2017 - 17:34

Someone once said, “if you have to explain the joke, it takes the fun out of it.” Well, the same can be said for designing a website. Explaining the important and sometimes technical details can be a tedious process many designers would avoid if possible. But when it comes to communicating the form and function of the user experience through wireframes, explaining each element can make or break the project. It’s always a good idea to include annotations.

TV and streaming tips: Star Trek, King Kong, German hackers

heise online Newsticker - 4 December 2017 - 17:30
From the oversupply of free-to-air TV and the numerous streaming providers, we have picked out a handful of nerd-friendly shows airing in the coming week.

Bundesnetzagentur: Prices for layer 2 bitstream access remain stable

heise online Newsticker - 4 December 2017 - 17:00
The regulator is slightly lowering the fees competitors pay for access in vectoring areas only for bandwidths up to 50 Mbit/s; the price for faster VDSL access, above all, is set to stay.

LakeDrops Drupal Consulting, Development and Hosting: Welcome Matthias

Planet Drupal - 4 December 2017 - 16:50
by Jürgen Haas, Mon, 12/04/2017 - 15:50

We are so glad to announce that Matthias Walti decided to join LakeDrops. He brings skills and experience in building e-commerce solutions, is a user experience expert and is well known for writing great content which is driven by his marketing background.

Doom VFR hands-on: Full speed into VR hell

heise online Newsticker - 4 December 2017 - 16:30
The new Doom VFR feels very different on PlayStation VR and HTC Vive. While console warriors on the gamepad enjoy flawless position changes, Vive veterans struggle with disorientation.

In-skill purchases: Earning more money with Alexa skills

heise online Newsticker - 4 December 2017 - 16:30
Alexa skills will soon be able to earn more money: thanks to a new in-skill payment feature, providers can sell additional premium content, for example. Prime customers are to receive discounts.

Doom VFR: High system requirements, GeForce 388.43 and AMD Crimson 17.11.4 graphics drivers required

heise online Newsticker - 4 December 2017 - 16:30
The PC version of Doom VFR needs beefy graphics hardware and up-to-date drivers so that all the stuttering doesn't make players in VR hell feel sick.

First study of the German games industry: Enormous potential, too little funding

heise online Newsticker - 4 December 2017 - 16:30
The video game industry in Germany generates more than the music or film industry, but has even greater potential. That is the finding of a detailed study whose authors therefore argue for more funding.