I focussed this week on Webpack and some React-y things. Lots of learning going on over here.

#1: Lazy Loading with Webpack (follow-up to code splitting)

Only load code when the user requires it - speeds up the initial load of the app (some blocks may never be loaded at all)

// The "print" chunk is only fetched from the server when the button is clicked
button.onclick = e =>
  import(/* webpackChunkName: "print" */ './print').then(module => {
    const print = module.default;
    print();
  });

Note: when using import() on ES6 modules, you must reference the .default property - the promise resolves with a module namespace object, and .default is the module’s default export

#2: React and Code Splitting

Incrementally downloading the app
- packages: webpack, babel-plugin-syntax-dynamic-import and react-loadable

babel-plugin-syntax-dynamic-import
- a syntax-only plugin, meaning Babel won’t do any transformation itself
- allows Babel to parse dynamic import() syntax so webpack can bundle the imports as code split points
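
For reference, a minimal .babelrc sketch that enables the plugin (the babel-plugin- prefix is dropped in the config):

{
  "plugins": ["syntax-dynamic-import"]
}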

react-loadable
- higher-order component for loading components with dynamic imports
- makes code splitting easy!

import React from 'react';
import Loadable from 'react-loadable';
import Loading from './Loading';

const LoadableComponent = Loadable({
  loader: () => import('./Dashboard'),
  loading: Loading,
});

export default class LoadableDashboard extends React.Component {
  render() {
    return <LoadableComponent />;
  }
}

Then use LoadableDashboard in your application and the Dashboard component will automatically be loaded and rendered when it’s needed
- Loading is a placeholder component to show while the real component is loading
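
A minimal sketch of such a Loading component (react-loadable also passes props like error and pastDelay that a fuller version could handle):

import React from 'react';

// Shown by LoadableComponent until Dashboard has finished loading
export default function Loading(props) {
  if (props.error) {
    return <div>Error loading component</div>;
  }
  return <div>Loading...</div>;
}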

react-loadable also has suggestions for server-side rendering

#3: File Bundling and HTTP/2: Rethinking Best Practices

HTTP/2: based on Google’s SPDY protocol - intent is to improve page load latency and security
- binary protocol not text based (more compact, efficient to parse and less prone to errors)
- multiplexed: multiple files can be transferred on a single connection
- server push: allows the server to transfer resources to the client before they’re requested (pre-filling the cache)

HTTP/2 and JS developers
- concatenating multiple files into bundles makes it difficult for the browser to effectively cache our code
- the whole bundle needs to be redownloaded if one line of code changes
- since HTTP/2 can multiplex (making requests inexpensive) we can split code into smaller bundles and make use of caching (better experience for users)

web servers also have limits on how efficiently they can serve a large number of files (so we shouldn’t endlessly split files)

#4: webpack and HTTP/2

- There is still protocol overhead for each request compared to a single concatenated file
- Compression of a single large file is better than compression of many small files
- Servers are slower at serving many small files than a single large file

Changing one module only invalidates the cache for one bundle, which is just part of the complete application - the rest of the application stays cached. Need to find a balance.

More bundles = better caching but less compression

AggressiveSplittingPlugin (from webpack)
- Splits the original chunks into smaller chunks (you specify the size)
- To combine similar modules, they are sorted alphabetically (by path) before splitting - modules in the same folder are probably related to each other and similar from a compression point of view, so with this sorting they end up in the same chunk

We need to reuse the previously created chunks
- When AggressiveSplittingPlugin finds a good chunk, it stores the chunk’s modules and hash into records (webpack’s concept of state that is kept between compilations)
- AggressiveSplittingPlugin tries to restore chunks from records before trying to split the remaining modules (ensures reuse)
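
A rough webpack config sketch using the plugin - the minSize/maxSize values and the records path are illustrative, not recommendations:

const path = require('path');
const webpack = require('webpack');

module.exports = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[chunkhash].js',
  },
  plugins: [
    new webpack.optimize.AggressiveSplittingPlugin({
      minSize: 30000, // target chunk size range in bytes (illustrative)
      maxSize: 50000,
    }),
  ],
  // records keep chunk/module ids stable between compilations so
  // previously created chunks can be restored and reused
  recordsPath: path.resolve(__dirname, 'records.json'),
};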

The application using this optimization will have multiple script tags to load each chunk in parallel
- The browser can start executing older files from its cache while waiting for the download of the most recent files
- HTTP/2 Server push can be used to send these chunks to the client when the HTML page is requested - best to start pushing the most recent file first as older files are more likely already in the cache
- The client can cancel push responses for files it already has, but this takes a round trip
- When using code splitting for on-demand loading, webpack handles the parallel requests for you

#5: 6 Reasons Why JavaScript’s Async/Await Blows Promises Away

Async/await makes asynchronous code look and behave a little more like synchronous code. Any async function implicitly returns a promise, and the resolved value of that promise will be whatever you return from the function.
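
A small sketch of that behaviour (getNumber is just a made-up example):

// The async keyword makes getNumber return a promise; the value you
// return becomes the promise's resolved value
async function getNumber() {
  return 42;
}

getNumber().then(value => console.log(value)); // logs 42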

Why is it better?
1. Concise + clean (no more .then, no more nested code)
2. Error Handling: can handle synchronous and asynchronous errors with the same construct (try/catch) - previously, try/catch couldn’t handle errors thrown inside the promise (see the sketch after this list)
3. Conditionals: conditional logic that depends on intermediate results is much simpler - no extra nesting of promises
4. Intermediate Values: no more crazy nesting of promises or resorting to Promise.all just to pass values along
5. Error stacks: the stack points at the function containing the error instead of a chain of .then calls - especially useful in production environments with large promise chains
6. Debugging: much easier - promises were annoying because you couldn’t set breakpoints in arrow functions that return expressions (no body)
- The debugger doesn’t step through .then statements because it only steps through synchronous code
- With async/await you can step through as if it’s synchronous
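
A sketch of the error-handling point (#2) - fetchData here is a hypothetical async helper that resolves with a JSON string:

// One try/catch covers both the rejected promise from fetchData and a
// synchronous JSON.parse error
async function loadData() {
  try {
    const raw = await fetchData(); // may reject asynchronously
    return JSON.parse(raw);        // may throw synchronously
  } catch (err) {
    console.error('Both kinds of failure land here:', err);
  }
}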

Harder things:
- More difficult to spot asynchronous code