<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Thoughts, stories and ideas. | Doug Niccum Design]]></title><description><![CDATA[Welcome to the personal blog of web/application developer Doug Niccum. I develop in just about anything: JavaScript, PHP, and mobile application platforms.]]></description><link>https://blog.dniccumdesign.com/</link><image><url>https://blog.dniccumdesign.com/favicon.png</url><title>Thoughts, stories and ideas. | Doug Niccum Design</title><link>https://blog.dniccumdesign.com/</link></image><generator>Ghost 3.13</generator><lastBuildDate>Tue, 27 Jan 2026 20:32:14 GMT</lastBuildDate><atom:link href="https://blog.dniccumdesign.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Passing uploaded files between Laravel Livewire components]]></title><description><![CDATA[<p>I recently discovered a need to be able to pass temporary (without being saved to your filesystem) uploaded files using Laravel Livewire for an API that I am building for my 9-5 job. 
Essentially (thinking with my VueJS cap) I wanted to build a uniform file-upload input that I could use</p>]]></description><link>https://blog.dniccumdesign.com/passing-uploaded-files-between-laravel-livewire-components/</link><guid isPermaLink="false">5fa5543d32e37540c4e8c82e</guid><category><![CDATA[laravel]]></category><category><![CDATA[tutorial]]></category><category><![CDATA[development]]></category><category><![CDATA[tailwind]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Fri, 06 Nov 2020 14:39:17 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1526112562240-f3c31a27a110?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1526112562240-f3c31a27a110?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Passing uploaded files between Laravel Livewire components"><p>I recently discovered a need to be able to pass temporary (without being saved to your filesystem) uploaded files using Laravel Livewire for an API that I am building for my 9-5 job. Essentially (thinking with my VueJS cap) I wanted to build a uniform file-upload input that I could use multiple times within my application.</p><p>Again, using my VueJS-centered brain, I just assumed that I would be able to simply emit up the <code>TemporaryUploadedFile</code> class that Livewire generates with a file upload. Naturally I was incorrect. <strong>My problem:</strong> when you attempt to pass this class to a parent/child component, all you get is:</p><figure class="kg-card kg-code-card"><pre><code class="language-php">[
    'disk' =&gt; 'public'
];</code></pre><figcaption>The dumped response from an emitted temporary file</figcaption></figure><p>Obviously you cannot do anything with this. So after some discovery, I was able to come up with a scalable solution.</p><h2 id="the-result">The result</h2><p>For the sake of being thorough, I am going to show you all of my code, so you can replicate the entire component for yourself. It is worth noting that I also used Alpine.JS and Tailwind for this component.</p><p>In the next few steps, this is what we are going to build:</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://blog.dniccumdesign.com/content/images/2020/11/Screen-Shot-2020-11-06-at-8.07.54-AM.png" width="878" height="352" alt="Passing uploaded files between Laravel Livewire components"></div><div class="kg-gallery-image"><img src="https://blog.dniccumdesign.com/content/images/2020/11/Screen-Shot-2020-11-06-at-8.08.00-AM.png" width="862" height="342" alt="Passing uploaded files between Laravel Livewire components"></div><div class="kg-gallery-image"><img src="https://blog.dniccumdesign.com/content/images/2020/11/Screen-Shot-2020-11-06-at-8.08.06-AM.png" width="876" height="346" alt="Passing uploaded files between Laravel Livewire components"></div></div></div><figcaption>File upload steps</figcaption></figure><p>Yours <em>MIGHT</em> look a little different as I have modified my <code>tailwind.config.js</code> file a bit.</p><h2 id="how-i-did-it">How I did it</h2><p>Assuming you have at least skimmed the documentation for <a href="https://laravel-livewire.com/docs/2.x/file-uploads">Laravel Livewire on file uploads</a>, you can see that once a file is uploaded, Livewire stores the file in a temporary directory within your filesystem; whether it is remote or local depends on your infrastructure. 
The response of this action is an object named <code>Livewire\TemporaryUploadedFile</code> with a bunch of other metadata. From there you can extract any information that you need and save/move the file around where you need it. Again, you can <a href="https://laravel-livewire.com/docs/2.x/file-uploads#storing-files">refer to the documentation</a> for further details.</p><p>Like I said above, if you simply try to use the <code>emit()</code> action to pass the result to other components, all that is passed is the disk path. I even tried JSON encoding the object, but that didn't work either. After dissecting the <code>Livewire\TemporaryUploadedFile</code> class, I discovered there is an exposed method <code>getRealPath()</code> that returns the temporary location of the file. Furthermore, I found that the path of the file is the parameter required to construct a new instance of the <code>TemporaryUploadedFile</code> class. Passing strings with Laravel Livewire is easy, so that was my ticket: emitting the full <em>real</em> path of the temporary file, and then simply re-constructing the <code>TemporaryUploadedFile</code> class in the parent/listening component.</p><p>Once finished, you can move/save the emitted file like so:</p><pre><code class="language-php">class FileUpload extends Component
{
    use WithFileUploads;

    public $file;
    
    ...
    
    public function fileComplete()
    {
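        // A clarifying note (my reading of this call, not part of the original
        // post): storePubliclyAs(directory, name, disk) moves the temporary
        // upload into permanent storage with public visibility; 'documents'
        // and 'sample' here are just example values.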
        $this-&gt;file-&gt;storePubliclyAs(
            'documents',
            'sample',
            config('filesystems.default'),
        );
    }
}</code></pre><h2 id="full-code">Full code</h2><h3 id="child-component-template">Child Component Template</h3><pre><code class="language-html">&lt;div
    x-data="{ isUploading: false, progress: 0, name: null }"
    x-on:livewire-upload-start="isUploading = true"
    x-on:livewire-upload-finish="isUploading = false; $wire.fileComplete()"
    x-on:livewire-upload-error="isUploading = false"
    x-on:livewire-upload-progress="progress = $event.detail.progress"
&gt;

    &lt;div class="overflow-hidden relative w-64 mt-4 mb-4"&gt;
        &lt;label class="font-sans py-2 px-4 border border-transparent text-sm font-semibold tracking-wider rounded-md transition duration-150 ease-in-out uppercase border-gray-300 text-gray-700 hover:text-gray-500 focus:outline-none focus:border-blue-300 focus:shadow-outline-blue inline-flex cursor-pointer" x-show="!name"&gt;
            &lt;svg fill="#FFF" height="18" viewBox="0 0 24 24" width="18" xmlns="http://www.w3.org/2000/svg"&gt;
                &lt;path d="M0 0h24v24H0z" fill="none"/&gt;
                &lt;path d="M9 16h6v-6h4l-7-7-7 7h4zm-4 2h14v2H5z"/&gt;
            &lt;/svg&gt;
            &lt;span class="ml-2"&gt;Select File&lt;/span&gt;
            &lt;input class="hidden" type="file" wire:model="file" x-on:change="name = $event.target.files[0].name"&gt;
        &lt;/label&gt;
        &lt;div class="text-gray-500" x-show="name"&gt;
            &lt;span x-show="isUploading"&gt;Uploading... Please wait.&lt;/span&gt;
            &lt;div class="flex items-center" x-show="!isUploading"&gt;
                &lt;div class="flex-1 truncate mr-2"&gt;
                    &lt;span x-text="name"&gt;&lt;/span&gt;
                &lt;/div&gt;
                &lt;button type="button" class="inline-flex items-center justify-center h-7 w-7 rounded-full bg-red-600 hover:bg-red-800 text-white shadow-lg hover:shadow-xl transition duration-150 ease-in-out focus:bg-red-700 outline-none focus:outline-none" x-on:click="progress = 0; name = null; $wire.file = null; $wire.fileReset()"&gt;
                    &lt;svg class="h-5 w-5" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" fill="currentColor"&gt;
                    &lt;path fill-rule="evenodd" d="M9 2a1 1 0 00-.894.553L7.382 4H4a1 1 0 000 2v10a2 2 0 002 2h8a2 2 0 002-2V6a1 1 0 100-2h-3.382l-.724-1.447A1 1 0 0011 2H9zM7 8a1 1 0 012 0v6a1 1 0 11-2 0V8zm5-1a1 1 0 00-1 1v6a1 1 0 102 0V8a1 1 0 00-1-1z" clip-rule="evenodd" /&gt;
                &lt;/svg&gt;
                &lt;/button&gt;
            &lt;/div&gt;
        &lt;/div&gt;
    &lt;/div&gt;

    &lt;!-- Progress Bar --&gt;
    &lt;div class="relative pt-1" x-show="isUploading"&gt;
        &lt;div class="overflow-hidden h-2 mb-4 text-xs flex rounded bg-green-200"&gt;
            &lt;div :style="`width: ${progress}%`" class="shadow-none flex flex-col text-center whitespace-nowrap text-white justify-center bg-green-500"&gt;&lt;/div&gt;
        &lt;/div&gt;
    &lt;/div&gt;
&lt;/div&gt;</code></pre><h3 id="child-component">Child Component</h3><pre><code class="language-php">&lt;?php

namespace App\Http\Livewire;

use Livewire\Component;
use Livewire\WithFileUploads;

class FileUpload extends Component
{
    use WithFileUploads;

    public $file;

    public function render()
    {
        return view('livewire.file-upload');
    }

    public function updatedFile()
    {
        $this-&gt;validate([
            'file' =&gt; 'file|max:8192', // 8MB Max
        ]);

        $this-&gt;emitUp('fileUploaded', $this-&gt;file);
    }

    public function fileComplete()
    {
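        // getRealPath() returns the temporary upload's absolute path as a
        // plain string, which Livewire can serialize and emit, unlike the
        // TemporaryUploadedFile object itself.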
        $this-&gt;emitUp('fileUpload', $this-&gt;file-&gt;getRealPath());
    }

    public function fileReset()
    {
        $this-&gt;emitUp('fileReset');
    }
}
</code></pre><h3 id="parent-component">Parent Component</h3><pre><code class="language-php">&lt;?php

namespace App\Http\Livewire;

use Livewire\Component;
use Livewire\TemporaryUploadedFile;

class ParentComponent extends Component
{
    protected $listeners = [
        'fileUpload',
        'fileReset',
    ];
    
    public $file;
    
    public function fileUpload($payload)
    {
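        // $payload is the real path emitted by the child component;
        // re-constructing TemporaryUploadedFile from it (plus the disk name)
        // restores the full upload API, e.g. getClientOriginalName() or store().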
        $this-&gt;file = new TemporaryUploadedFile($payload, config('filesystems.default'));
        
        /* When you are ready to store the file */
        /*
        $this-&gt;file-&gt;storePubliclyAs(
            'documents',
            'sample',
            config('filesystems.default'),
        );
        */
    }

    public function fileReset()
    {
        $this-&gt;file = null;
    }
}</code></pre><p>Hopefully you can find this useful. I was banging my head against the wall for a couple hours with this, and naturally the solution was easier than anticipated.</p><!--kg-card-begin: html--><div style="width:100%;height:0;padding-bottom:76%;position:relative;"><iframe src="https://giphy.com/embed/xT5LMzIK1AdZJ4cYW4" width="100%" height="100%" style="position:absolute" frameborder="0" class="giphy-embed" allowfullscreen></iframe></div><p><a href="https://giphy.com/gifs/thesimpsons-the-simpsons-3x14-xT5LMzIK1AdZJ4cYW4">via GIPHY</a></p><!--kg-card-end: html-->]]></content:encoded></item><item><title><![CDATA[Using Laravel Mix, Tailwind and PurgeCSS to build a Grav theme]]></title><description><![CDATA[If you are around the PHP and Open Source community, you have probably at least heard of the utility-based CSS framework: Tailwind. The ability for developers of varying frontend chops to build custom user interfaces with little to no CSS is essentially unparalleled. ]]></description><link>https://blog.dniccumdesign.com/using-tailwind-and-purgecss-with-grav/</link><guid isPermaLink="false">5f6215b132e37540c4e8c697</guid><category><![CDATA[tailwind]]></category><category><![CDATA[CSS]]></category><category><![CDATA[Grav]]></category><category><![CDATA[laravel]]></category><category><![CDATA[tutorial]]></category><category><![CDATA[case study]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Wed, 16 Sep 2020 22:13:23 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1497343369881-b74d56186549?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1497343369881-b74d56186549?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Using Laravel Mix, Tailwind and PurgeCSS to build a Grav theme"><p>If you are around the PHP and 
Open Source community, you have probably at least heard of the utility-based CSS framework: <a href="https://tailwindcss.com/">Tailwind</a>. The ability for developers with varying degrees of frontend chops to build custom user interfaces with little to no CSS is essentially unparalleled. In the off-chance you haven't, Tailwind describes itself as:</p><blockquote>Tailwind CSS is a highly customizable, low-level CSS framework that gives you all of the building blocks you need to build bespoke designs without any annoying opinionated styles you have to fight to override.<br>...<br>Instead of opinionated predesigned components, Tailwind provides low-level utility classes that let you build completely custom designs without ever leaving your HTML.</blockquote><p>Being an application developer myself, you can probably understand why I am enamored with it. However, I will admit that it <em>does have a bit of a learning curve</em>. A good portion of it has to do with the initial setup and the recommended use of <a href="https://purgecss.com/">PurgeCSS</a>, because of the <em>extremely</em> beefy (2380.4kB) base CSS file that Tailwind ships with. So again, it is highly recommended that you use a PostCSS tool like PurgeCSS to remove unused classes and utilities from your builds.</p><h2 id="selecting-a-build-tool">Selecting a build tool</h2><p>Despite all of my experience in the web/application/mobile development scenes, I will be the first to admit that I am NOT a <a href="https://webpack.js.org/">Webpack</a> expert. I know just enough to be effective but I do not know all of the ins and outs. Just like Tailwind, it can take a bit of work to get it to behave the way you want. So naturally I began looking for another option.</p><p>In case you didn't know, I am a Laravel application developer by trade. I love just about everything about the Laravel ecosystem. 
Yep, call me a fanboy:</p><figure class="kg-card kg-embed-card"><iframe src="https://tenor.com/embed/5406366" width="600" height="400" frameborder="0"></iframe></figure><p>One of the items that I absolutely love is <a href="https://laravel-mix.com/">Laravel Mix</a>. This Node module is essentially a Webpack wrapper that makes Webpack far more accessible and cleaner with an optimized structure/API. As it turns out, you can use Laravel's front end build tool, Mix, <a href="https://laravel-mix.com/docs/5.0/installation#stand-alone-project">in other non-Laravel environments.</a> Golden.</p><h2 id="building-a-website-for-techy-people">Building a website for "techy" people</h2><p>The whole inspiration for this post came from the need for a co-worker and me to build a simple marketing landing page for an application that we have been working on. Since this was intended to be a single page, we didn't need a fully baked CMS, but we also wanted to have the ability to update some content/images/etc without code changes.</p><p>My recommendation was to use <a href="https://getgrav.org/">Grav</a>. This CMS, in my opinion, is one of the better lightweight and flat-file CMSs out there. It does not require a lot of server resources (namely a database) to run and can be saved via version control. With all of that said, it <em>can</em> have a bit of a learning curve and may not be super easy to use for people who do not have a lot of CMS experience. That's why I do not heavily recommend this CMS to all my clients. Still, there is a ton that you can do with it. This is because it relies pretty heavily on YAML and Markdown; making Grav a perfect solution for "technologically inclined" people.</p><h2 id="the-build">The build</h2><p>So we decided to use Grav and to build a theme for it, using Tailwind. The Grav documentation is pretty good, but they do not really include a bunch of information on how to use SCSS or any other type of pre/post processing in the building of your theme. 
So what you see below is the result of my trial and error.</p><h3 id="1-set-up-grav">1. Set up Grav</h3><p>There are a couple ways to install Grav. I will <a href="https://learn.getgrav.org/16/basics/installation">send you over to their docs</a> so you can follow them. No sense in duplicating content.</p><h3 id="2-create-theme">2. Create theme</h3><p>First things first, we need to install the Grav devtools. This can be done via the Grav Package Manager (GPM). Run this command within your Grav directory:</p><pre><code class="language-bash">bin/gpm install devtools</code></pre><p><em>Note:</em> This does not need to be installed globally. The GPM is part of any installation. </p><p>Next, let's generate the theme:</p><pre><code class="language-bash">bin/plugin devtools new-theme</code></pre><p>This process will ask you a few questions that are required to create the new theme and will compile your answers into your theme's YAML configuration file. Activate the new theme in the system configuration and we are good to go.</p><h3 id="3-mix-and-other-dependencies">3. Mix and other dependencies</h3><p>Now that the theme has been created, let's start installing some stuff. Create a <code>package.json</code> file in the theme's root directory with the following content:</p><figure class="kg-card kg-code-card"><pre><code class="language-json">{
    "private": true,
    "scripts": {
        "dev": "npm run development",
        "development": "cross-env NODE_ENV=development node_modules/webpack/bin/webpack.js --progress --hide-modules --config=node_modules/laravel-mix/setup/webpack.config.js",
        "watch": "npm run development -- --watch",
        "watch-poll": "npm run watch -- --watch-poll",
        "hot": "cross-env NODE_ENV=development node_modules/webpack-dev-server/bin/webpack-dev-server.js --inline --hot --disable-host-check --config=node_modules/laravel-mix/setup/webpack.config.js",
        "prod": "npm run production",
        "production": "cross-env NODE_ENV=production node_modules/webpack/bin/webpack.js --no-progress --hide-modules --config=node_modules/laravel-mix/setup/webpack.config.js"
    },
    "dependencies": {
        "tailwindcss": "^1.6.2"
    },
    "devDependencies": {
        "cross-env": "^7.0.2",
        "laravel-mix": "^5.0.4",
        "laravel-mix-purgecss": "^5.0.0",
        "node-sass": "^4.14.1",
        "sass-loader": "^8.0.2"
    }
}
</code></pre><figcaption>If you don't plan on using SCSS/SASS feel free to leave off <code>node-sass</code> and <code>sass-loader</code>.&nbsp;</figcaption></figure><p>Next in the same location create a <code>webpack.mix.js</code> file with:</p><figure class="kg-card kg-code-card"><pre><code class="language-javascript">const mix = require('laravel-mix')
const tailwindcss = require('tailwindcss')
require('laravel-mix-purgecss');

mix.sass('scss/app.scss', 'css/custom.css')
    .sourceMaps()
    .options({
        processCssUrls: false,
        postCss: [
            tailwindcss('tailwind.config.js')
        ],
    })
    .purgeCss({
        enabled: mix.inProduction(),
        content: [
            `./templates/**/*.twig`,
            `./scss/**/*.scss`
        ],
        folders: ['js', 'scss', 'templates'],
        extensions: ['html', 'js', 'twig', 'scss'],
        whitelistPatterns: [
            //
        ],
    });</code></pre><figcaption>Again, if you are not planning on using SCSS/SASS change: <code>mix.sass('scss/app.scss', 'css/custom.css')</code> to <code>mix.css('css/app.css', 'css/custom.css')</code></figcaption></figure><h3 id="4-setting-up-tailwind">4. Setting up Tailwind</h3><p>I'm sure you noticed the mention of Tailwind via <code>tailwindcss('tailwind.config.js')</code>. We haven't created that file yet, so let's do that now. In the same directory as the <code>package.json</code> file, create <code>tailwind.config.js</code> with the following content:</p><pre><code class="language-javascript">const defaultTheme = require('tailwindcss/defaultTheme')
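// Note: defaultTheme is unused in this starter config, but it is handy once
// you start extending the theme, for example (illustrative only):
// theme: { extend: { fontFamily: { sans: ['Inter', ...defaultTheme.fontFamily.sans] } } }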

module.exports = {
  purge: [],
  theme: {
    extend: {},
  },
  variants: {},
  plugins: [],
}
</code></pre><p>I won't go into much detail about how to configure Tailwind as it is pretty well <a href="https://tailwindcss.com/docs/installation">spelled out in their documentation</a>. This is all we need to get started at least.</p><h3 id="5-modifying-the-templates">5. Modifying the templates</h3><p>Up until now we have been creating the necessary files and workflows to generate our CSS to render in our theme, but we haven't actually done anything with our theme yet. When you created the new theme, a <code>./templates/partials/base.html.twig</code> file should have been generated. You should see something like the following within the <code>&lt;head&gt;</code> tag:</p><pre><code class="language-HTML">{% set theme_config = attribute(config.themes, config.system.pages.theme) %}
&lt;!DOCTYPE html&gt;
&lt;html lang="{{ grav.language.getActive ?: grav.config.site.default_lang }}"&gt;
&lt;head&gt;
{% block head %}
    &lt;meta charset="utf-8" /&gt;
    &lt;title&gt;{% if header.title %}{{ header.title|e('html') }} | {% endif %}{{ site.title|e('html') }}&lt;/title&gt;

    &lt;meta http-equiv="X-UA-Compatible" content="IE=edge"&gt;
    &lt;meta name="viewport" content="width=device-width, initial-scale=1"&gt;
    {% include 'partials/metadata.html.twig' %}

    &lt;link rel="icon" type="image/png" href="{{ url('theme://images/logo.png') }}" /&gt;
    &lt;link rel="canonical" href="{{ page.url(true, true) }}" /&gt;
{% endblock head %}

{% block stylesheets %}
    {% do assets.addCss('https://unpkg.com/purecss@1.0.0/build/pure-min.css', 100) %}
    {% do assets.addCss('https://maxcdn.bootstrapcdn.com/font-awesome/4.7.0/css/font-awesome.min.css', 99) %}
{% endblock %}

{% block javascripts %}
    {% do assets.addJs('jquery', 100) %}
{% endblock %}

{% block assets deferred %}
    {{ assets.css()|raw }}
    {{ assets.js()|raw }}
{% endblock %}
&lt;/head&gt;</code></pre><p>I want to direct your attention to the <code>{% block stylesheets %}</code> tag. Within it are a few references to some CSS files. The numbers, 100 and 99 respectively, set the priority in which these assets will be rendered at runtime. Since we will be using a custom-built stylesheet utilizing Tailwind and PurgeCSS, we can now update the <code>{% block stylesheets %}</code> tag to show the following:</p><pre><code class="language-HTML">{% block stylesheets %}
    {% do assets.addCss('theme://css/custom.css', 100) %}
{% endblock %}</code></pre><h3 id="6-compiling-tailwind">6. Compiling Tailwind</h3><p>All that is left to do is build our CSS. If we use <code>npm run dev</code> or <code>yarn dev</code>, you will see a relatively large file being output to <code>css/custom.css</code>. That is because PurgeCSS will <em>only run</em> when Webpack is in production mode. If you use <code>npm run prod</code> or <code>yarn prod</code>, you will notice that the CSS file is exponentially smaller. That is because PurgeCSS will comb through all of your project's files and look for the use of Tailwind's CSS. If it is not being used, it will be purged; effectively making your website's assets much lighter weight.</p><h2 id="that-s-it">That's it</h2><p>Hopefully you found some of this enlightening. Tailwind has revolutionized how I develop and code; regardless if it is an application or website. Hopefully it will become useful for you as well.</p>]]></content:encoded></item><item><title><![CDATA[Modify image permissions on S3 file uploads within Laravel Nova]]></title><description><![CDATA[Using a PHP invokable class, you can manipulate/change anything about your Laravel Nova image/file uploads.]]></description><link>https://blog.dniccumdesign.com/modify-image-permissions-on-s3-file-uploads-within-laravel-nova/</link><guid isPermaLink="false">5f21e6d532e37540c4e8c5d8</guid><category><![CDATA[laravel]]></category><category><![CDATA[php]]></category><category><![CDATA[aws]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Wed, 29 Jul 2020 21:58:01 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1514446975754-a97fb2c05237?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1514446975754-a97fb2c05237?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" 
alt="Modify image permissions on S3 file uploads within Laravel Nova"><p>If you are a Laravel aficionado like myself, you like just about everything that Taylor Otwell and his team crank out. Those guys over there develop some very good platforms with extraordinary design and support. This includes their administrative dashboard that they released a couple years ago at Laracon: <a href="https://nova.laravel.com">Laravel Nova</a>.</p><p>If you aren't familiar with Laravel Nova, it is an administrative dashboard that automatically hooks into your application's model structure and provides a phenomenal PHP-based API for you to configure the fields and their behavior; not to mention that it can be extended via third-party plugins and extensions.</p><p>I happen to have written a few of these extensions myself, <a href="https://novapackages.com/collaborators/dniccum">so check them out</a> if you are interested. A shameless plug, <em>I KNOW.</em></p><h2 id="going-public">Going public</h2><p>While using another one of Laravel's first-party platforms, <a href="https://vapor.laravel.com">Laravel Vapor</a>, I ran into an issue where by default, all file uploads from Laravel Nova are made private within an S3 bucket. This is <em>extremely</em> inconvenient because it either forces you to manually change the access permissions of the uploaded object or the bucket itself, or <a href="https://laravel.com/docs/7.x/filesystem#file-urls">use a temporary signed URL</a>. If you decide to go with the second option, this can cause its own set of issues within Nova. I really struggled with this and for the life of me could not find a way around it.</p><h2 id="using-invokables">Using invokables</h2><p>After doing some digging I stumbled upon a section within the <a href="https://nova.laravel.com/docs/2.0/resources/file-fields.html#invokables">Nova documentation</a> where you can use invokable classes to manipulate the <code>Image</code> and <code>File</code> Nova input fields. 
What they essentially let you do is pass a series of parameters into an invoked class, run a bunch of logic, and then return a single/series of values to be saved into the database.</p><p>Admittedly there is <em>a lot</em> of rather confusing stuff in this section of the docs, but with a little trial and error I was <strong>finally </strong>able to solve my issue: upload images from Laravel Nova into S3 with public permissions while not manipulating the object at the bucket level. See below.</p><h2 id="how-to">How To</h2><p>First, create a class within your application (the name and associated namespace don't matter) and add the following code:</p><pre><code class="language-php">&lt;?php


namespace App\Library\Contracts;

use Illuminate\Http\Request;


class NovaImageUpload
{
    /**
     * Store the incoming file upload.
     *
     * @param  \Illuminate\Http\Request  $request
     * @param  \Illuminate\Database\Eloquent\Model  $model
     * @param  string  $attribute
     * @param  string  $requestAttribute
     * @param  string  $disk
     * @param  string  $storagePath
     * @return array
     */
    public function __invoke(Request $request, $model, $attribute, $requestAttribute, $disk, $storagePath)
    {
        $currentFile = $request-&gt;file($attribute);

        return [
            $attribute =&gt; $currentFile-&gt;storePublicly($storagePath)
        ];
    }
}</code></pre><p>Essentially what this code does is take a series of parameters and allow you to sort through them fairly easily and fully customize the upload experience. While this code sample is pretty simple, hopefully you can see how much power you actually have. Just a small note: the <code>$attribute</code> variable is the property of the model that will store the file path, and the <code>$storagePath</code> is the directory that the file will be stored in; which for me was <code>/</code>.</p><p>Next we need to add the following lines to your Nova resource:</p><pre><code class="language-php">use App\Library\Contracts\NovaImageUpload;

Image::make('Image', 'product_image')
// code to add:
    -&gt;store(new NovaImageUpload);</code></pre><p>Using the <code>store</code> method, you pass the invokable class to hijack the file upload process and then use the functionality that you outlined in the invoked class. It looks kind of complicated on the surface, but once you figure out what you are doing, it is kind of empowering knowing how/where your stuff is being manipulated.</p><p>Hope this helps!</p>]]></content:encoded></item><item><title><![CDATA[Add HTTPS redirects to your ApostropheCMS application on Heroku]]></title><description><![CDATA[<p>When you want to redirect/force an application's routes to use the HTTPS protocol, typically you use the web server (NGINX, Apache, IIS, etc.) to force the application to use a specific protocol and then resolve the routes that way. But what happens when you use a cloud-based service</p>]]></description><link>https://blog.dniccumdesign.com/add-https-redirects-to-your-apostrophecms-application-on-heroku/</link><guid isPermaLink="false">5ec2c6ab32e37540c4e8c538</guid><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Mon, 18 May 2020 17:58:26 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1582970816926-c8b60f417661?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1582970816926-c8b60f417661?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Add HTTPS redirects to your ApostropheCMS application on Heroku"><p>When you want to redirect/force an application's routes to use the HTTPS protocol, typically you use the web server (NGINX, Apache, IIS, etc.) to force the application to use a specific protocol and then resolve the routes that way. But what happens when you use a cloud-based service like Heroku? 
What's more, what happens when you use NodeJS where the server's process is pointed at a specific file and not a directory; ultimately eliminating the use of the <code>.htaccess</code> file?</p><p>I recently ran across such an issue when deploying an <a href="https://apostrophecms.com/">ApostropheCMS</a> website to Heroku. Since you are reading this article, I will assume that you know what ApostropheCMS is (if not, it's a NodeJS-powered enterprise CMS with a WYSIWYG/GUI editing interface like that of Sitecore/Adobe Experience Manager). Heroku offers free SSL certificates with an application's custom domains, but how do you redirect all requests to use the HTTPS protocol?</p><h2 id="step-1">Step 1</h2><p>First you need to be on a paid Heroku dyno that starts around $7 per month.</p><h2 id="step-2">Step 2</h2><p>We will be using the <code><a href="https://www.npmjs.com/package/heroku-ssl-redirect">heroku-ssl-redirect</a></code> plugin. Add it to your ApostropheCMS like so:</p><pre><code class="language-bash">yarn add heroku-ssl-redirect</code></pre><p>or</p><pre><code class="language-bash">npm install heroku-ssl-redirect --save</code></pre><h2 id="step-3">Step 3</h2><p>Here is where the real magic starts to happen. If you haven't already, we are going to create a <a href="https://docs.apostrophecms.org/core-concepts/global-settings/settings.html#user-editable-global-settings_">new <em>global</em> Apostrophe CMS module</a>. To do so, we can use the ApostropheCMS CLI like so:</p><pre><code class="language-bash">apos create-module apostrophe-global</code></pre><p>The result will be a new module directory and accompanying nested <code>index.js</code> created within the <code>modules</code> directory.</p><h2 id="step-4">Step 4</h2><p>Finally, we need to configure the new global module that we just created:</p><pre><code class="language-javascript">var sslRedirect = require('heroku-ssl-redirect');

module.exports = {
   construct: function(self, options, callback) {
      self.apos.app.use(sslRedirect());
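      // Illustrative: per the plugin's source (shown later in this post),
      // sslRedirect() also accepts an array of environments and a status code,
      // e.g. self.apos.app.use(sslRedirect(['production'], 301));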

      callback();
   }
};</code></pre><p>So what we have done is couple the <code>heroku-ssl-redirect</code> module to the base Express process that is running behind Apostrophe. The middleware it registers will run on <em>every request.</em></p><p>In case you want to know what is going on behind the scenes, this is what the module is doing:</p><pre><code class="language-javascript">/**
* Force load with https on production environment
* https://devcenter.heroku.com/articles/http-routing#heroku-headers
*/
module.exports = function(environments, status) {
  environments = environments || ['production'];
  status = status || 302;
  return function(req, res, next) {
    if (environments.indexOf(process.env.NODE_ENV) &gt;= 0) {
      if (req.headers['x-forwarded-proto'] != 'https') {
        res.redirect(status, 'https://' + req.hostname + req.originalUrl);
      }
      else {
        next();
      }
    }
    else {
      next();
    }
  };
};</code></pre><p>As you can see, what we have done is basically attach a custom middleware to each request within Apostrophe; forcing every request to use the HTTPS protocol.</p><p>As I had issues trying to figure this out for myself, hopefully this will demystify ApostropheCMS a little more for you. Cheers.</p>]]></content:encoded></item><item><title><![CDATA[Dynamic Javascript components using Laravel Mix that are hosted remotely]]></title><description><![CDATA[After adding the support for dynamic imports to an application that I have developed for a client, I uncovered an issue with this feature within the application's production environment. ]]></description><link>https://blog.dniccumdesign.com/dynamic-vuejs-components-using-laravel-mix-that-are-hosted-remotely/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4bf</guid><category><![CDATA[javascript]]></category><category><![CDATA[laravel]]></category><category><![CDATA[tutorial]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Fri, 07 Jun 2019 15:14:45 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2019/06/john-torcasio-1501473-unsplash.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.dniccumdesign.com/content/images/2019/06/john-torcasio-1501473-unsplash.jpg" alt="Dynamic Javascript components using Laravel Mix that are hosted remotely"><p>If you are an avid reader of the <a href="https://laravel-news.com">Laravel News blog/newsletter</a> (like myself), you probably saw Jason Beggs's post the other day titled <a href="https://laravel-news.com/using-dynamic-imports-with-laravel-mix"><em>Using Dynamic Imports with Laravel Mix</em></a>. In case you didn't, the subject matter of this post touched on the new addition to Jeffrey Way's <a href="https://github.com/JeffreyWay/laravel-mix">Laravel Mix</a> package that helps with asset compilation for the Laravel framework. 
More specifically, the official support for dynamic imports within Javascript components.</p><p>This has been a long-awaited feature for developers like myself whose Webpack Javascript bundles are pretty large in size. The negative side effects associated with these bloated packages can be very impactful, especially for users on the mobile platform; leading to longer load times, slower performance, lost users, etc. However, with the addition of support for dynamic imports, these size difficulties could be a thing of the past.</p><h2 id="the-remote-connection">The remote connection</h2><p>After adding the support for dynamic imports to an application that I have developed for a client, I uncovered an issue with this feature within the application's production environment. This application is currently being hosted with <em><a href="https://www.fortrabbit.com">fortrabbit</a></em>, a managed cloud-based hosting platform that specializes in PHP applications and websites. <em>fortrabbit's </em>solution for Javascript, CSS, and image storage is through an S3 bucket because developers do not have access to <em>fortrabbit's</em> local file system.</p><p>The issue that I discovered is that by default Webpack assumes that all of the dynamic imports are stored locally with the application. However, if you are using a hosting platform like <em>fortrabbit</em>, your dynamic imports will result in a fancy 404 error. After some research and a couple hours of trial and error, I was able to devise a somewhat simple solution to fix those pesky 404 errors.</p><h2 id="adding-the-webpack-config">Adding the Webpack Config</h2><p>To start, we need to open the application's <code>webpack.mix.js</code> file and add the following code:</p><pre><code>mix.webpackConfig({
    output: {
        publicPath: process.env.PUBLIC_PATH ? process.env.PUBLIC_PATH : '/',
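        // Illustrative: if PUBLIC_PATH=https://remote-path.com/ is set at build
        // time, a dynamic chunk is requested from
        // https://remote-path.com/build/js/[name].js instead of /build/js/[name].js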
        chunkFilename: 'build/js/[name].js',
    },
});</code></pre><p>First, I want to call out the <code>chunkFilename</code> attribute. Within my application, I put all of the compiled Javascript and CSS in a build directory which is then ignored by version control. Feel free to change this path as needed; except for the <code>[name]</code> placeholder, as this is required.</p><p>Second, let's look at the <code>publicPath</code> attribute. As you can see, I have added a ternary expression that tests for the existence of a <strong>PUBLIC_PATH</strong> environment variable. If such a variable can be found, Webpack tells the application to look for these imports at the variable's location/value. Otherwise, the path defaults to the root public path; which supports all of my local development.</p><h2 id="building-the-javascript">Building the Javascript</h2><p>Now that we have told Webpack the public path that we want it to use when the application is in a non-local environment, we need to have Mix compile the Javascript and CSS. To do so I use the command below:</p><pre><code>PUBLIC_PATH=https://remote-path.com/ npm run prod</code></pre><p>Using a NodeJS environment variable, I prepended the desired remote location to the standard command that runs the Laravel Mix production compilation. The resulting path for a compiled imported file will look like this:</p><pre><code>https://remote-path.com/build/js/12.js</code></pre><p>Now all of the dynamic components can be found by our application regardless of where the components are being hosted.</p><p><strong>NOTE:</strong> Laravel Mix also supports the ability to read variables from your <code>.env</code> file, so if you feel more comfortable storing this variable there, you may do so and simply remove the environment variable from your build command.</p><h2 id="things-to-keep-in-mind">Things to keep in mind</h2><p>Obviously what I have shown above is how I implemented this within my own production applications. 
You may store your Javascript and/or CSS in different locations or compile your assets differently due to various automated processes; so you will have to update the paths or configurations accordingly. It is also worth mentioning again: <strong>this process is only necessary if you are using something like an S3 bucket to store your files</strong>. If you utilize an environment like a Laravel Forge setup, the default configuration will work just fine.</p><p>Cheers!</p>]]></content:encoded></item><item><title><![CDATA[Adding documentation to your Laravel Nova installations]]></title><description><![CDATA[Documentation (especially good documentation) for an application can turn a good application and client experience into a great one. Once we as developers complete an application that we have worked on and potentially even lost sleep over, it will live on well after we stop working on it.]]></description><link>https://blog.dniccumdesign.com/adding-documentation-to-your-laravel-nova-installations/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4be</guid><category><![CDATA[laravel]]></category><category><![CDATA[nova]]></category><category><![CDATA[tutorial]]></category><category><![CDATA[php]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Wed, 13 Feb 2019 17:29:14 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2019/02/denise-jans-1195969-unsplash.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.dniccumdesign.com/content/images/2019/02/denise-jans-1195969-unsplash.jpg" alt="Adding documentation to your Laravel Nova installations"><p>Documentation (especially good documentation) for an application can turn a good application and client experience into a <em>great</em> one. Once we as developers complete an application that we have worked on and potentially even lost sleep over, it will live on well after we stop working on it. 
Despite our best efforts, clients and customers tend to forget the things that we show them about how to manage their application. That's where documentation comes in.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://images.unsplash.com/photo-1509239129736-8c395375220d?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=1080&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" class="kg-image" alt="Adding documentation to your Laravel Nova installations"><figcaption>Photo by <a href="https://unsplash.com/@jamietempleton?utm_source=ghost&utm_medium=referral&utm_campaign=api-credit">Jamie Templeton</a> / <a href="https://unsplash.com/?utm_source=ghost&utm_medium=referral&utm_campaign=api-credit">Unsplash</a></figcaption></figure><p>I'll come right out and say it: I hate writing documentation; regardless of what it is for. My normal avenues for implementing documentation are via a Word document, or even a repository wiki. Recently I even used this <a href="https://larecipe.binarytorch.com.my/docs/1.3/overview">awesome documentation package called LaRecipe</a>, which I thoroughly enjoyed using. However, these solutions didn't solve my main pain point: the documentation isn't all in the same place. It required a client to open up another application or web browser and comb through a few different pages. It was these issues that were the inspiration behind the new Nova tool that I built.</p><h2 id="building-yet-another-tool">Building yet another tool</h2><p>So like I said, to solve my own pain, I built a <a href="https://github.com/dniccum/nova-documentation">new Laravel Nova tool called Nova Documentation</a> (creative name, right?).</p><p>What this tool essentially does is convert Markdown files and then pass them to the Vue component within the tool. 
Ultimately it is that simple; however, there are a few other things that I added that I think will be useful:</p><ul><li>Syntax highlighting for code blocks (thanks to <a href="https://highlightjs.org/">highlight.js</a>)</li><li>Dynamic navigation/table of contents</li><li>Document nesting with URL rewriting</li><li>Flexible configuration settings</li></ul><h2 id="installation">Installation</h2><p>To start, let's install the Nova tool with:</p><pre><code>composer require dniccum/nova-documentation</code></pre><p>Next, we need to publish the tool's configuration file and a couple markdown files with examples to get you started:</p><pre><code>php artisan vendor:publish --provider="Dniccum\NovaDocumentation\ToolServiceProvider"
</code></pre><p>By default, the sample markdown files that the tool uses to render its content will be added to the <code>resources/documentation</code> path. If you would like to move this directory, be sure to update this setting within the <code>config/novadocumentation.php</code> configuration file:</p><pre><code>/*
|--------------------------------------------------------------------------
| Home Page
|--------------------------------------------------------------------------
|
| The markdown document that will be used as the home page and/or entry
| point. This will be located within the documentation directory that resides
| within your application's resources directory.
|
*/

'home' =&gt; 'documentation/home.md',</code></pre><p>And finally, just like every other Nova tool, we need to register the tool within the NovaServiceProvider:</p><pre><code>use Dniccum\NovaDocumentation\NovaDocumentation;

...

/**
 * Get the tools that should be listed in the Nova sidebar.
 *
 * @return array
 */
public function tools()
{
    return [
        // other tools
        new NovaDocumentation,
    ];
}</code></pre><p>Assuming that you have done everything correctly, you should see a "Documentation" link in your Nova sidebar with this as a homepage:</p><figure class="kg-card kg-image-card kg-width-wide kg-card-hascaption"><img src="https://blog.dniccumdesign.com/content/images/2019/02/screenshot-1.png" class="kg-image" alt="Adding documentation to your Laravel Nova installations"><figcaption>Documentation tool home page</figcaption></figure><h2 id="using-the-tool">Using the tool</h2><p>Assuming that you are using the default installation configuration, all of the markdown files will be stored in the <code>resources/documentation</code> directory. Again, if you use the default settings, the <code>home.md</code> file that is found there will represent the home page:</p><pre><code># Welcome to your documentation

Within this area, you can:

* Define features
* Add how-to's
* Link to tutorial videos/images

To see any other Markdown syntax and examples, please see [the sample](sample.md).</code></pre><h3 id="generating-your-content">Generating your content</h3><p>A few things you should note:</p><ul><li>Any and all markdown is allowed in these markdown files.</li><li>You should be able to add any other HTML that you want to render within the page; excluding any Javascript, as I am using the Vue.js template compiler to build the page and any Javascript-based content probably won't render correctly.</li><li>I did add the necessary CSS to clean up some of the appearance of standard HTML tags like ordered/un-ordered lists, headers, block quotes, etc from the base stylesheet that Taylor Otwell constructed for the Nova dashboard. If you happen to find something that isn't quite styled correctly, drop me a line on the <a href="https://github.com/dniccum/nova-documentation/issues">issues page on Github</a>. </li><li>Feel free to organize your markdown files as you wish; whether everything is in the base directory or if you would like to add files within directories. The tool will detect them and build their URLs accordingly.</li></ul><h3 id="setting-your-titles">Setting your titles</h3><p>It is important to note that the H1 tags within each of these files will be dynamically pulled to build the sidebar navigation. With that said, make sure your titles are descriptive yet short in length to prevent the sidebar from becoming cluttered.</p><h2 id="other-features">Other features?</h2><p>As you continue to work with this tool, let me know if there are any other features that you would like to see added. Thank you for your interest!</p>]]></content:encoded></item><item><title><![CDATA[Customizing your Laravel Broadcasting payloads]]></title><description><![CDATA[<p>One piece of the Laravel documentation that is somewhat <em>glanced</em> over in my opinion is the ability to customize the payload that is sent out to your Laravel Echo listeners upon new events or <code>broadcast</code> methods. 
This can be extremely important when results from the models passed to these events</p>]]></description><link>https://blog.dniccumdesign.com/customizing-your-laravel-broadcasting-payloads/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4bd</guid><category><![CDATA[laravel]]></category><category><![CDATA[pusher]]></category><category><![CDATA[php]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Wed, 30 Jan 2019 15:00:00 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2019/01/containers.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://blog.dniccumdesign.com/content/images/2019/01/containers.jpg" alt="Customizing your Laravel Broadcasting payloads"><p>One piece of the Laravel documentation that is somewhat <em>glanced</em> over in my opinion is the ability to customize the payload that is sent out to your Laravel Echo listeners upon new events or <code>broadcast</code> methods. This can be extremely important when results from the models passed to these events are large and/or bloated; for example: when you are storing chunks of HTML in your database.</p><p>By default, the Broadcasting functionality within Laravel serializes any and all classes/models that you pass to the events. What does this mean? Well, let's say for example that you have a model that has 15 columns for each result; however, in your select statement you only query 3 columns, like so:</p><pre><code>$user = \App\User::where('id', 6)
	-&gt;select(['id', 'name', 'email'])
	-&gt;first();

event(new \App\Events\UserLoggedOut($user));</code></pre><p> If you pass this model to the event, all 15 columns will be queried; regardless of what you select, or the number of <code>unset</code> methods you use. If you are using a service like <a href="https://pusher.com/">Pusher</a>, you could run into character limits that will ultimately lead to errors and failed messages.</p><h2 id="modifying-your-broadcasts">Modifying your broadcasts</h2><p>In order to prevent our broadcasts from erroring, we need to customize the payload that is sent to Pusher or any other platform. We can do that by placing a <code>broadcastWith</code> method within our event class. Let's go back to the example above, where we only want to broadcast the id, name, and email address. Our broadcastWith method would look like this:</p><!--kg-card-begin: markdown--><pre><code class="language-php">/**
 * Specifically broadcasts the necessary array keys
 *
 * @return array
 */
public function broadcastWith()
{
    return [
        'user' =&gt; [
            'id' =&gt; $this-&gt;user-&gt;id,
            'name' =&gt; $this-&gt;user-&gt;name,
            'email' =&gt; $this-&gt;user-&gt;email,
        ]
    ];
}
</code></pre>
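<p>On the client side, the trimmed payload is exactly what a Laravel Echo listener receives. As a rough sketch (the <code>users</code> channel name and the helper function are hypothetical, and a configured Echo instance is assumed):</p><pre><code class="language-javascript">// Hypothetical helper that registers a listener for the event above;
// "echo" is your configured Laravel Echo instance (e.g. window.Echo).
function listenForLogouts(echo) {
    echo.channel('users')
        .listen('UserLoggedOut', function (payload) {
            // "payload" mirrors the array returned by broadcastWith()
            console.log(payload.user.name + ' (' + payload.user.email + ') logged out');
        });
}</code></pre>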
<!--kg-card-end: markdown--><p>By implementing this method, your broadcasts will be significantly simplified and thus carry a lighter payload. But that's not all! Because this is a method, and not a variable, you can perform additional logic within this method to conditionally add content to the payload. Refer to another example below:</p><!--kg-card-begin: markdown--><pre><code class="language-php">/**
 * Specifically broadcasts the necessary array keys
 *
 * @return array
 */
public function broadcastWith()
{
    $payload = [
        'configuration' =&gt; [
            'id' =&gt; $this-&gt;configuration-&gt;id,
            'name' =&gt; $this-&gt;configuration-&gt;name,
            'user' =&gt; [
                'first_name' =&gt; $this-&gt;configuration-&gt;user-&gt;first_name,
                'last_name' =&gt; $this-&gt;configuration-&gt;user-&gt;last_name,
            ],
        ]
    ];

    if (!empty($this-&gt;configuration-&gt;jurisdictions)) {
        $jurisdiction = $this-&gt;configuration-&gt;jurisdictions-&gt;first();

        $payload['jurisdiction'] = [
            'id' =&gt; $jurisdiction-&gt;id
        ];
    }

    if (!empty($this-&gt;configuration-&gt;comments)) {
        $payload['comments'] = $this-&gt;configuration-&gt;comments;
    }

    return $payload;
}
</code></pre>
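<p>Because <code>jurisdiction</code> and <code>comments</code> are only added to the payload conditionally, whatever consumes the broadcast should guard for their presence. A small illustrative handler (this helper function and its channel wiring are my own sketch, not part of the event class above):</p><pre><code class="language-javascript">// Illustrative consumer of the conditional payload built above.
function describeConfigurationPayload(payload) {
    var lines = ['Configuration: ' + payload.configuration.name];

    // These keys are only present when broadcastWith() added them
    if (payload.jurisdiction) {
        lines.push('Jurisdiction: ' + payload.jurisdiction.id);
    }
    if (payload.comments) {
        lines.push(payload.comments.length + ' comment(s)');
    }

    return lines;
}</code></pre><p>Hooked up to Echo, this could be called as, for example, <code>Echo.channel('configurations').listen('ConfigurationSaved', describeConfigurationPayload)</code> (channel and event names hypothetical).</p>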
<!--kg-card-end: markdown--><p>What I have done is assign the initial associative array to a variable and use some basic if-statements to conditionally add content to the payload.</p><p>While this isn't necessarily rocket science, using this method of customizing your broadcasts can simplify your application and save you a bunch of time wrestling with the Pusher/websockets APIs.</p>]]></content:encoded></item><item><title><![CDATA[Automate your Buddy pipelines with a custom AWS CLI implementation]]></title><description><![CDATA[Set up the Amazon Web Services CLI to automatically upload assets within Buddy.]]></description><link>https://blog.dniccumdesign.com/adding-custom-s3-support-to-buddy-pipeline/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4bc</guid><category><![CDATA[deploy]]></category><category><![CDATA[buddy]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Sun, 27 Jan 2019 20:45:17 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2019/01/amazon-s3.png" medium="image"/><content:encoded><![CDATA[<img src="https://blog.dniccumdesign.com/content/images/2019/01/amazon-s3.png" alt="Automate your Buddy pipelines with a custom AWS CLI implementation"><p>Recently I updated a Laravel 5.7 project to use the newest version of Jeffrey Way's compilation platform Laravel Mix. This update includes Webpack 4 support, faster installations and compilation processes, <a href="https://laravel-news.com/laravel-mix-4-released">and a bunch of other goodies</a>. Unfortunately, this update introduced a wrench into an automated deployment process that I had in place.</p><h2 id="my-environment">My environment</h2><h3 id="hosting">Hosting</h3><p>I use <a href="https://www.fortrabbit.com/">fortrabbit</a> for a lot of my production applications because of their managed Laravel hosting that is relatively inexpensive compared to other offerings, and yet it is extremely reliable. 
For asset storage, fortrabbit employs a custom S3 bucket implementation. When I say custom, I mean that all of the normal S3 API protocols apply to this system; however, the standard Amazon Web Services authentication will not work because, well, it's not Amazon; <em>duh.</em></p><p>While it is a great implementation, it is worth keeping in mind that a lot of the standard out-of-the-box plugins for S3 uploads all assume you are uploading to AWS, which is not compatible with fortrabbit's S3 implementation.</p><h3 id="automation">Automation</h3><p>To automate my deployment processes, <a href="https://buddy.works?fp_ref=s87n7">I use this great platform called Buddy</a>. Buddy is like Jenkins, Bamboo, and other automation platforms; it allows you to perform dev ops functions with the click of a button or a repository action (push, merge, etc). The exceptions are that it offers an interface that is much more user friendly than most other options, and it provides pre-configured plugins that allow you to jump start your deployment processes quicker than Jenkins or Bamboo.</p><p>What's more, Buddy is different because it uses Docker containers to run anything from Python, NodeJS, RSYNC or even Haskell on your repository's code. You can even attach MariaDB, MongoDB, or MySQL databases to these Docker containers to run unit testing.</p><h2 id="my-issue">My issue</h2><p>I had been using this <a href="https://github.com/MikaAK/s3-plugin-webpack">Webpack plug-in</a> to automatically upload all of my compiled Javascript and CSS upon the completion of the development/production deployment. When upgrading to Mix 4, the updated S3 Webpack plugin uploads all of the resulting files in a multi-part upload, which fortrabbit does not support. 
So, as you would expect, my automated deployments began producing errors.</p><h2 id="solution">Solution</h2><p>Below is a screenshot of my Buddy pipeline for my production deployment:</p><figure class="kg-card kg-image-card"><img src="https://blog.dniccumdesign.com/content/images/2019/01/Screen-Shot-2019-01-27-at-1.37.44-PM.png" class="kg-image" alt="Automate your Buddy pipelines with a custom AWS CLI implementation"></figure><p>Conventional wisdom would suggest that I just utilize Buddy's default S3 upload plugin. While that would be right in most cases, as I said earlier, most S3 upload solutions will not work with fortrabbit because you need to be able to provide custom URL endpoints and custom authentication protocols.</p><h3 id="configuration">Configuration</h3><p>In the screenshot above, you can see that instead of an AWS-branded solution, I have decided to use a stock Linux Python v3.5 Docker container with the following settings:</p><figure class="kg-card kg-image-card"><img src="https://blog.dniccumdesign.com/content/images/2019/01/Screen-Shot-2019-01-27-at-2.00.21-PM.png" class="kg-image" alt="Automate your Buddy pipelines with a custom AWS CLI implementation"></figure><p>Using PIP, I installed <a href="https://aws.amazon.com/cli/">the official AWS CLI</a> and then configured it using the S3 credentials that were supplied to me by fortrabbit; hence the various <code>aws configure set</code> commands. Here are a few other things to pay attention to:</p><ul><li>The fourth configuration setting (<code>aws configure set output text</code>) simply tells AWS that I want all of the results to be outputted as text; which is perfect for simple automation.</li><li>The fifth and sixth configuration settings tell the CLI that I want to use the new V4 authentication protocol. <strong>This is important</strong>. 
If you don't use these settings, you will get an error saying that the signature is invalid.</li></ul><h3 id="run-the-configuration">Run the configuration</h3><p>Once the container has been configured, I can run the command that will upload the necessary assets to S3:</p><pre><code>aws s3 cp public/build/ s3://gm-production/build/ --recursive --exclude ".DS_Store"  --endpoint-url=https://objects.us1.frbit.com</code></pre><p>So let's break this down:</p><ul><li>Skipping the <code>aws s3</code> portion, as that should be pretty self-explanatory, the next portion should look pretty familiar to those who use BASH commands: <code>cp public/build/ s3://gm-production/build/</code>. Essentially you are copying a directory to a URL using the S3 protocol.</li><li><code>--recursive</code> tells the CLI that you want to copy the directory in its entirety; including any additional sub-directories.</li><li>You can tell AWS if there are any files/directories that you don't want to copy. You can do this using a regular expression. In my case, with <code>--exclude ".DS_Store"</code> I am telling the CLI that I don't want to copy those pesky DS_Store files that macOS generates.</li><li>Finally, and possibly the most important, is the endpoint URL for the bucket: <code>--endpoint-url=https://objects.us1.frbit.com</code>. With custom S3 buckets this URL has to be added to the request, or the CLI will think that you are going to be uploading to an AWS-based bucket.</li></ul><h3 id="uploading-your-files">Uploading your files</h3><p>Assuming that you have everything hooked up correctly, the upload process that is returned by Buddy should look something like this:</p><pre><code>aws s3 cp public/build/ s3://gm-production/build/ --recursive --exclude .DS_Store --endpoint-url=https://objects.us1.frbit.com
Completed 256.0 KiB/6.2 MiB (1.3 MiB/s) with 3 file(s) remaining
Completed 512.0 KiB/6.2 MiB (2.2 MiB/s) with 3 file(s) remaining
Completed 768.0 KiB/6.2 MiB (3.3 MiB/s) with 3 file(s) remaining
Completed 1.0 MiB/6.2 MiB (4.4 MiB/s) with 3 file(s) remaining  
Completed 1.2 MiB/6.2 MiB (5.2 MiB/s) with 3 file(s) remaining  
Completed 1.5 MiB/6.2 MiB (5.9 MiB/s) with 3 file(s) remaining  
Completed 1.8 MiB/6.2 MiB (6.8 MiB/s) with 3 file(s) remaining  
Completed 2.0 MiB/6.2 MiB (7.8 MiB/s) with 3 file(s) remaining  
Completed 2.0 MiB/6.2 MiB (7.5 MiB/s) with 3 file(s) remaining  
upload: public/build/css/vendor.css to s3://gm-testing/build/css/vendor.css
Completed 2.0 MiB/6.2 MiB (7.5 MiB/s) with 2 file(s) remaining
Completed 2.3 MiB/6.2 MiB (7.8 MiB/s) with 2 file(s) remaining
Completed 2.5 MiB/6.2 MiB (8.2 MiB/s) with 2 file(s) remaining
Completed 2.8 MiB/6.2 MiB (8.6 MiB/s) with 2 file(s) remaining
Completed 3.0 MiB/6.2 MiB (8.3 MiB/s) with 2 file(s) remaining
Completed 3.3 MiB/6.2 MiB (8.9 MiB/s) with 2 file(s) remaining
Completed 3.5 MiB/6.2 MiB (9.3 MiB/s) with 2 file(s) remaining
upload: public/build/css/main.css to s3://gm-testing/build/css/main.css
Completed 3.5 MiB/6.2 MiB (9.3 MiB/s) with 1 file(s) remaining
Completed 3.7 MiB/6.2 MiB (9.7 MiB/s) with 1 file(s) remaining
Completed 4.0 MiB/6.2 MiB (9.3 MiB/s) with 1 file(s) remaining
Completed 4.2 MiB/6.2 MiB (9.8 MiB/s) with 1 file(s) remaining
Completed 4.5 MiB/6.2 MiB (9.8 MiB/s) with 1 file(s) remaining
Completed 4.7 MiB/6.2 MiB (10.0 MiB/s) with 1 file(s) remaining
Completed 5.0 MiB/6.2 MiB (9.7 MiB/s) with 1 file(s) remaining 
Completed 5.2 MiB/6.2 MiB (9.6 MiB/s) with 1 file(s) remaining 
Completed 5.5 MiB/6.2 MiB (9.2 MiB/s) with 1 file(s) remaining 
Completed 5.7 MiB/6.2 MiB (8.8 MiB/s) with 1 file(s) remaining 
Completed 6.0 MiB/6.2 MiB (7.9 MiB/s) with 1 file(s) remaining 
Completed 6.2 MiB/6.2 MiB (6.7 MiB/s) with 1 file(s) remaining 
Completed 6.2 MiB/6.2 MiB (2.3 MiB/s) with 1 file(s) remaining 
upload: public/build/js/app.js to s3://gm-testing/build/js/app.js
Build finished successfully!</code></pre><h2 id="get-buddy">Get Buddy</h2><p>Has this article interested you in Buddy? <a href="https://buddy.works?fp_ref=s87n7">Follow this link to get signed up for free!</a></p>]]></content:encoded></item><item><title><![CDATA[Setting yourself above: speaking at a conference]]></title><description><![CDATA[Public speaking can be petrifying to people who have never done it before. However speaking at conferences has tons of benefits for you and your career.]]></description><link>https://blog.dniccumdesign.com/setting-yourself-above-speaking-at-a-conference/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4bb</guid><category><![CDATA[editorial]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Tue, 11 Apr 2017 20:49:23 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Whenever I tell a colleague that I am speaking at a conference, the common response is usually along the lines of, &quot;you get up in front of people and talk? I could never do that.&quot; The fact that I am here in the flesh writing to you now proves that this daunting task is easier than many think. What's more is the fact that it's not just about doing something intimidating or challenging, it's all about sharing that little bit of knowledge that you have with someone to make them better.</p>
<p>It has been said that when you attend a conference (technical or creative), the speakers at that conference represent the top 5% of minds in their respective fields. Essentially the cream of the crop. While the attendees can very easily benefit from this type of talent and knowledge, speaking at a conference can benefit you just as much. Surrounding yourself with this type of knowledge is infectious; which can only improve you as a person, personally and professionally, and as a subject matter expert.</p>
<p>So you may be saying to yourself, &quot;Ok, so I can get smarter and/or more knowledgeable. Big deal. Is that all?&quot; Well, no, of course not. Keep reading.</p>
<h2 id="investinginyourskill">Investing in your skill</h2>
<p>I can speak from experience that there is nothing more embarrassing than getting up in front of a group of people and receiving questions that you don't know the answers to; especially when you are live-coding. It's awkward for you and the people attending your talk. So how do you avoid this?</p>
<p>It's simple: do your homework.</p>
<p>While you may be pretty comfortable with your subject matter, there is always an opportunity for you to add just a few extra facts to help draw in your audience even more. More often than not, these facts, pointers, or how-to's are items that you probably don't already know. A common requirement for conferences is for you to stage your code on GitHub or an equivalent with any associated documentation. A by-product of this requirement is clean code and detailed how-to's; all extremely effective tools for attendees who may not be experts. It's these little nuggets that help validate you even more as a speaker.</p>
<h3 id="whatitdidforme">What it did for me</h3>
<p>For example, in preparation for a session that I was going to give on the <a href="http://ionicframework.com">Ionic Mobile Framework</a>, I had to familiarize myself with multiple deployment platforms. I am an Apple/Mac guy: I have multiple iPads, several iPhones, and a couple of Apple computers at home. However, to think that everyone else works on the exact same devices I do would be naive and, frankly, stupid. So I invested in a cheap PC as well as an Android phone so I could broaden the scope of my documentation; allowing me to understand the ins and outs of Android Studio and the Java SDK (software development kit). Boy, am I glad I did. I found more than a couple of &quot;gotchas&quot; that could have easily derailed my talk.</p>
<p>All of these items require time. This is time that you spend becoming even more familiar with the documentation and processes and thus committing even more information to memory. In the end you solidify yourself within the minds of your attendees as a subject matter expert.</p>
<h2 id="buildingyourselfup">Building yourself up</h2>
<p>If you want to speak publicly, you don't necessarily have to immediately jump into the conference environment. Speaking in front of complete strangers with zero context on who you are or where you come from can be frightening. I get it. Fortunately I can vouch for the fact that there are several other opportunities for you to speak before taking the conference plunge.</p>
<h3 id="clientmeetings">Client meetings</h3>
<p>One opportunity that commonly gets overlooked is client meetings. While you may not be speaking in a traditional workshop-session setting, you are in fact speaking to people who are not familiar with your content; requiring you to explain in detail how you came to a certain conclusion. What's more, you may receive questions on your conclusions, forcing you to think quickly on your feet and placing even more emphasis on your preparation.</p>
<h3 id="focusgroups">Focus groups</h3>
<p>Another type of meeting that is common in the agency setting is a focus group. This can be a meeting called between like-minded people to decide on a process for the larger masses, or it could be an open forum during a lunch and learn. Each setting has a different audience makeup, all with unique skill sets and knowledge, but each requires you to be prepared and to explain your content in detail to ensure no one is left behind.</p>
<h3 id="screencasts">Screencasts</h3>
<p>While not necessarily formal, recording a screencast for YouTube is another great place to begin speaking. You don't have the pressure of a live audience, allowing for additional focus on your diction and the flow of the presentation. As an added benefit, if your videos get popular, you can begin to earn a little ad revenue as well!</p>
<h2 id="promotionselfandcompany">Promotion - self and company</h2>
<p>Being in the industry that we are in, it is safe to say that we take the concept of marketing for granted. We consistently market a product or idea to others while not necessarily putting a lot of effort into marketing ourselves or our employer's brand. It's this notion of brand recognition, of either ourselves or our company, that attracts like-minded talent.</p>
<p>People, and thus word of mouth, are the best way for a message to spread; I'm sure all of you social media people can agree. This is how viral threads get started: through word of mouth or a social presence, which is in fact an extension of our voice. The more advocates and content in the public eye that your brand is connected to, the more the message is fueled.</p>
<p>In my opinion, a common misconception is that brand recognition comes from award-winning work. Design periodicals and other marketing resources continuously push the draw of awards to their readers. Don't get me wrong, awards are great: they create great magazine material and are awe-inspiring when you enter the front doors of the office and see all of those trophies on the walls. However, these awards can be all but useless if the message of your brand is lost.</p>
<p>Stereotypically, when you speak at a conference, you introduce yourself and the company that you work for. At this moment you are instantly bringing your employer's brand to the forefront of your attendees' minds. Just like a creative TV advertisement, that recognition will be associated with the content that you are about to present; not only making you look better, but your employer as well.</p>
<h2 id="soyouwanttospeakgreatyouhavehelp">So you want to speak? Great, you have help.</h2>
<p>As you prepare for your session, one thing to keep in mind is that you do not have to go through this process alone.</p>
<p>Your company's leadership likely includes a number of professionals who have attended conferences of various sizes and hold an enormous wealth of information. While their subject matter may be different, that doesn't matter. Having a fresh perspective on your content can only help you during the preparation and feedback stages.</p>
<p>Refer to the section above about the various settings that you can utilize to help get your feet wet speaking publicly. I understand that this is not an easy task for everyone; I placed myself in that category at one point in time. However, just like everything else, practice makes perfect. Keep reviewing your notes and always be open to feedback; good and bad.</p>
<p>At the end of the day, the people who attend your talks want you to succeed. No one attends your sessions looking for you to fail. Determination and hard work yield quality for your talk and for you as a presenter.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Improving your site using retina-optimized images with RetinaJS]]></title><description><![CDATA[The quality of the images within a website or application can either make or break a project. Using RetinaJS, you can provide high fidelity photography to your users.]]></description><link>https://blog.dniccumdesign.com/improving-your-site-using-retina-optimized-images-with-retinajs/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4ba</guid><category><![CDATA[javascript]]></category><category><![CDATA[case study]]></category><category><![CDATA[frontend]]></category><category><![CDATA[tutorial]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Tue, 07 Mar 2017 14:30:00 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2017/03/jan-erik-waider-147107-web.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://blog.dniccumdesign.com/content/images/2017/03/jan-erik-waider-147107-web.jpg" alt="Improving your site using retina-optimized images with RetinaJS"><p>Now several years into my professional web development career, I can say that while the quality of your written code will define who you are as a professional among your peers, the &quot;everyday joes&quot; that come to your site are not going to care if your code is indented correctly or if you use the latest DRY implementations. What the typical user cares about is design and user experience: does your website look pretty, and does it ultimately fulfill their needs as a user? If the answer to this question is not a resounding &quot;yes,&quot; then your website or application is effectively broken to these individuals.</p>
<p>I think we can all agree that user experience within a site or application is the paramount concern for us as interactive designers. But for the sake of this article, that topic lies beyond our scope of conversation. We are here to talk about the aesthetics that make a site attractive and pleasing to the eye.</p>
<h2 id="usingimagestoyouradvantageordisadvantage">Using images to your advantage or disadvantage</h2>
<p>As we progress through the development of a new site or application, what can easily set one site above another is the quality and composition of its photography. An environment that is designed somewhat poorly can overcome enormous shortcomings if the site incorporates quality photographs. Cost aside (since we all know this isn't cheap), this is the quickest and easiest way to improve a site.</p>
<p>Excellent photos aren't just a matter of composition, lighting, color, etc. Showing an image at the incorrect size and/or resolution can quickly render it useless if the photo appears grainy or blurred. This is why a team can greatly benefit from close communication between the developers and creatives, making sure the correct images are being utilized in the best way possible. Even an uneducated user can identify an image that looks out of place. This is why it is our responsibility as creative professionals to relieve the user of these burdens.</p>
<h2 id="utilizingthetechnologyevolutiontoourbenefit">Utilizing the technology evolution to our benefit</h2>
<p>If you are a designer or front-end web developer, I'm sure you are familiar with the <em>72 dpi doctrine</em> that was drilled into us at school. The instructors preached until they were blue in the face that 72 dpi (dots per inch) was the best way to render an image without sacrificing quality or load speed. Any other way was useless and overkill.</p>
<p>Fortunately this is no longer the case.</p>
<p>Up until the last few years, computer monitors and mobile device screens did not have the ability to render high quality images, simply because they did not have the pixel density to show anything above 72 dpi. So it was essentially pointless to use anything above 72 dpi within a site; especially given the inherent file-size increases and the respective page-load penalty.</p>
<p>Thanks to the folks over at Apple with their Retina screens and Samsung with their AMOLED screens, we now have high density monitors and screens on our devices; allowing us to reap the benefits of high resolution images and animation frame rates above 24 frames per second. All of these new advancements provide a richer experience for all users.</p>
<h2 id="retinaimagessimplyput">Retina images simply put</h2>
<p>So now we can support these high density images, but how do we show them? If we regard <a href="https://developer.apple.com/library/content/documentation/2DDrawing/Conceptual/DrawingPrintingiOS/SupportingHiResScreensInViews/SupportingHiResScreensInViews.html#//apple_ref/doc/uid/TP40010156-CH15-SW1">Apple's documentation</a> on how these images are automatically swapped out, we can control which images appear with simple filename suffixes such as @2x or @3x. That's it! The browser and operating system will take care of the rest.</p>
<p><em>It is worth noting that @3x was recently introduced thanks to the new iPhone 6/7+ and iPad Pro product lines.</em></p>
<p>So let's say for a second that we have an image named <code>main-logo.png</code>. The browser/application would show this image as is, regardless of the device we were using to view it. Now, if we were to include another image that is <strong>exactly</strong> twice the dimensions of the original in the same location as our <code>main-logo.png</code>, named <code>main-logo@2x.png</code>, the browser/application would know to render this image whenever a high density screen is being utilized.</p>
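<p>To make the naming convention concrete, here is a minimal sketch of how the high-density variant of a path could be derived. This is purely illustrative; the helper name is hypothetical and not part of any library:</p>

```javascript
// Illustrative sketch of the @2x/@3x naming convention described above.
// The function name is hypothetical, not taken from any plugin's API.
function retinaPath(src, density) {
  density = density || 2;
  var dot = src.lastIndexOf('.');
  // No extension? Just append the density suffix.
  if (dot === -1) return src + '@' + density + 'x';
  // Insert the suffix immediately before the file extension.
  return src.slice(0, dot) + '@' + density + 'x' + src.slice(dot);
}

console.log(retinaPath('/images/main-logo.png'));    // "/images/main-logo@2x.png"
console.log(retinaPath('/images/main-logo.png', 3)); // "/images/main-logo@3x.png"
```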
<h2 id="whyuseajavascriptpluginlikeretinajs">Why use a Javascript plugin like RetinaJS</h2>
<p>So you may be asking yourself at this point: if the browser or device is smart enough to automatically swap out the images, why do we have to include a third-party plugin like <a href="http://imulus.github.io/retinajs/">RetinaJS</a>? Well, the answer is somewhat complicated, so I've outlined a few points to simplify it for you:</p>
<ol>
<li><strong>Browser Support</strong> - While your computer or phone may support this functionality natively, not all browsers will perform these image swaps for you. This plugin will fire after the page's initial load, quickly combing through the page to swap out the appropriately tagged images to their high resolution counterparts.</li>
<li><strong>Responsive Design</strong> - Assuming that anything we build will need to be responsive, all images incorporated into our pages need the ability to grow and/or shrink freely based on the screen size the page is viewed on. This poses a problem when you are effectively doubling or tripling the size of the image after page load. Our Javascript plugin will set height and width attributes on the original image before swapping its source to prevent any unnecessary expansion.</li>
<li><strong>Page Load</strong> - I think we can all agree that when you increase the size of an image, whether it's the density or the dimensions, the file size of the image grows with it. It's simple math. I think we can also agree that if you are cumulatively increasing all of the images across your page and site, the site will not load as fast. Utilizing a plugin that fires after page load helps spread the load time over a longer period. This means your site will load as it normally would initially, swapping in the high resolution images once the page is ready for them.</li>
</ol>
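<p>The order of operations described in point 2 above can be sketched like this. To be clear, this is an illustration of the idea, not the actual RetinaJS source:</p>

```javascript
// Illustrative only — not the RetinaJS source. Locking the rendered
// width/height BEFORE the source swap keeps the doubled-size file from
// stretching the element's layout when it arrives.
function swapToRetina(img) {
  // 1. Lock in the currently rendered dimensions.
  img.width = img.width || img.naturalWidth;
  img.height = img.height || img.naturalHeight;

  // 2. Then point the element at its @2x counterpart.
  var dot = img.src.lastIndexOf('.');
  img.src = img.src.slice(0, dot) + '@2x' + img.src.slice(dot);
  return img;
}
```

<p>In the real plugin, something along these lines runs against every tagged image once the page has loaded, which is exactly what spreads the extra download cost out after the initial render.</p>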
<h2 id="howtouseretinajs">How to use RetinaJS</h2>
<p>Using the plugin is ultimately very easy, regardless of whether you are building an application or a CMS-powered website. What's more is that you have a few different options available to you for calling RetinaJS to swap out your images.</p>
<h3 id="includetheplugin">Include the plugin</h3>
<p>First things first: include the plugin within your technology stack, either through your minification pipeline or simply by direct injection:</p>
<pre><code>&lt;script type=&quot;text/javascript&quot; src=&quot;/scripts/retina.js&quot;&gt;&lt;/script&gt; 
</code></pre>
<h3 id="option1preprocessedstylesheet">Option 1 - Pre-processed stylesheet</h3>
<p>I will make the assumption that you will be using something like SASS/SCSS or LESS to quickly scaffold your site's stylesheets. Thankfully, RetinaJS comes with mixins (the LESS syntax is shown here) to automatically swap out your background images, like so:</p>
<pre><code>#logo {
  .at2x('/images/my_image.png', 200px, 100px);
}
</code></pre>
<p>This would then compile to something like this:</p>
<pre><code>#logo {
  background-image: url('/images/my_image.png');
}

@media all and (-webkit-min-device-pixel-ratio: 1.5) {
  #logo {
    background-image: url('/images/my_image@2x.png');
    background-size: 200px 100px;
  }
}
</code></pre>
<p>The media query beneath the logo ID is the key. Basically, it matches any kind of media, regardless of its type, checks the display density, and then applies additional CSS to that selector. Also pay special attention to the background size: this locks in the image dimensions and prevents the image from growing unnecessarily, as I referenced above; the extra pixels instead increase the image's effective density.</p>
<h3 id="option2imageelementdataattributes">Option 2 - Image/Element data attributes</h3>
<p><strong>If you are building a website on a CMS, this is probably the best solution for you.</strong></p>
<p>The other option you have available to you is simply incorporating an HTML data attribute to every HTML element (div, section, etc) or image tag, telling RetinaJS to look for the high resolution variant of the defined image like so:</p>
<pre><code>&lt;img src=&quot;/images/my_image.png&quot; data-rjs=&quot;2&quot; /&gt;
</code></pre>
<p>This will then be re-rendered like so after the plugin is fired:</p>
<pre><code>&lt;img src=&quot;/images/my_image@2x.png&quot; data-rjs=&quot;2&quot; data-rjs-processed=&quot;true&quot; width=&quot;200&quot; height=&quot;120&quot;&gt;
</code></pre>
<p>The height and width attributes were added to keep the image from growing, and the source was swapped out. An added benefit is that this technique also works for an HTML element that has a background image:</p>
<pre><code>&lt;div style=&quot;background-image: url(/images/my_image.png)&quot; data-rjs=&quot;2&quot;&gt;&lt;/div&gt;
</code></pre>
<p>This will then be re-rendered like so after the plugin is fired:</p>
<pre><code>&lt;div style=&quot;background-image: url(/images/my_image@2x.png); background-size: 200px 100px&quot; data-rjs=&quot;2&quot; data-rjs-processed=&quot;true&quot;&gt;&lt;/div&gt;
</code></pre>
<h2 id="experiencingtheresultsforyourself">Experiencing the results for yourself</h2>
<p>While I can stand on my soap box and preach the benefits of utilizing a method such as this, realistically the only way you can see the benefit is to actually &quot;see&quot; it.</p>
<p>I was recently part of a team that launched a completely redesigned website, <a href="https://www.bridgestonegolf.com/en-us/index">BridgestoneGolf.com</a>, that utilizes the process outlined above. I can honestly say that the increased fidelity and quality of the product images within the site is absolutely stunning. And again, it is something that I cannot explain; you have to visually see it for yourself.</p>
<p>I hope this helps you make a decision on whether to include something like this on your website. I have had phenomenal success with it personally, and I cannot say enough about the improved user experience it provides.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Prairie.Code() 2016 Round-Up]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>For those of you who were not aware, fellow VML employee Heather Downing and I had the privilege to speak at the Prairie.Code() 2016 conference in Des Moines, Iowa at the end of this past October. While this conference enjoyed its first year in existence, this did not</p>]]></description><link>https://blog.dniccumdesign.com/prairie-code-2016-round-up/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4b9</guid><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Mon, 14 Nov 2016 16:29:12 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>For those of you who were not aware, fellow VML employee Heather Downing and I had the privilege to speak at the Prairie.Code() 2016 conference in Des Moines, Iowa at the end of this past October. While this conference enjoyed its first year in existence, this did not diminish the quality of the speakers and content presented within its sessions. I would like to extend my commendation to each and every individual who contributed to putting on this conference.</p>
<p>Below you will find a quick summary of the talks that I conducted as well as all of the relevant information/code samples that were accompanied with each. Enjoy.</p>
<h2 id="buildamobileapplicationlikeawebdeveloperusingionic">Build a mobile application like a web developer using Ionic</h2>
<p>My first session was a 4-hour workshop that covered the Ionic mobile platform. The goal at the end of this session was to educate each participant to the point that they could take the knowledge and skill attained during this session and build their own application.</p>
<h3 id="summary">Summary</h3>
<p><a href="http://ionicframework.com">Ionic</a> is a hybrid mobile application framework. The term &quot;hybrid&quot; refers to the fact that applications built in Ionic create a digital handshake between web technologies (HTML, Javascript, CSS, etc.) and native languages (Objective-C, Java, etc.). Essentially, you build a single page web application that is hosted within a native wrapper. So you build the application like a web developer would, but export the application using Xcode or Android Studio like that of a mobile developer; all while writing one set of code and tapping into the benefits of each operating system.</p>
<p>The subject matter of my talk opened with an introduction to the Ionic platform similar to that of the previous paragraph. From there I escorted each participant through the development environment setup process. Once the environment was in place, we began building a simple application called &quot;Who Paid Last?&quot; During the development process of this application, each participant would have learned the following:</p>
<ul>
<li>A basic understanding of Angular 2, the Typescript syntax, and how <a href="http://ionicframework.com/docs/v2/">Ionic 2</a> uses these languages to its advantage.</li>
<li>What an Apache-Cordova plugin is, and how you can utilize the power of these plugins to build applications with native behavior.</li>
<li>Building custom touch gestures, and how you can implement them within your interface.</li>
<li>Adding analytic support with the help of Mixpanel.</li>
<li>Finally, building an application that is ready for a production release.</li>
</ul>
<h3 id="complications">Complications</h3>
<p>A major issue that I encountered within this workshop was the time that it took to debug each person's environment setup. While I felt extremely prepared during the planning process, things like IT security restrictions and preferential user setups that yielded errors were items that I did not account for. The time required to move throughout the room to help each person with their environment took in excess of two hours. This proved problematic, as I had prepared <strong>four hours of content</strong>. As you would expect, I was then forced to move quickly through the content, skipping some topics and not being able to take questions to effectively speak to the content that I was preaching. Ultimately this left the project unfinished at the end of the workshop. In an effort to provide additional value to the participants, I finished the project through <a href="https://www.youtube.com/watch?v=m6pZ19N8H6o&amp;list=PL5TyhAOOfI8HQFdDy2VN3kqtN7rA2FDLO">a series of screencasts</a>.</p>
<p>I have been asked by several people if I will give this talk again, and I can honestly say that I don't know. I have a few ideas on what I could do differently, but in the long run, I cannot avoid the environmental technology issues that I encountered.</p>
<h3 id="codeandcontent">Code and Content</h3>
<p>Below you will find all of the code and content that was utilized for this session:</p>
<ul>
<li><a href="https://github.com/dniccum/Intro-To-Ionic-App">Code repository</a></li>
<li><a href="https://drive.google.com/open?id=1kG5zCVV8A4VKGKkM3EUN208Rq1W0n2_pK6_LBtccTR0">Slide deck</a></li>
<li><a href="https://www.youtube.com/watch?v=m6pZ19N8H6o&amp;list=PL5TyhAOOfI8HQFdDy2VN3kqtN7rA2FDLO">Videos</a></li>
<li><a href="https://blog.dniccumdesign.com/ionic-resources/">Additional resources</a></li>
</ul>
<h2 id="buildingapisquicklyandefficientlyusingsailsjs">Building APIs quickly and efficiently using SailsJS</h2>
<p>The second session that I conducted was more of an educational session, versus the previous workshop-style session. This session touched on the <a href="https://nodejs.org/en/">NodeJS</a> framework <a href="http://sailsjs.org/">SailsJS</a>, and I spoke on how you can use this lightweight yet extensible platform to quickly scaffold full-featured web applications and REST APIs.</p>
<h3 id="summary">Summary</h3>
<p>The NodeJS website describes it like so: &quot;Node.js is a platform built on Chrome's JavaScript runtime for easily building fast and scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices.&quot; The power that NodeJS provides its users is that you can take Javascript, the lifeblood of every front-end developer's skill set, and write backend applications with it. SailsJS enhances this power by building its platform on top of <a href="http://expressjs.com/">Express.js</a>, probably the most popular NodeJS framework, and extending it with additional tools and features.</p>
<p>As I began to speak to these features, I was able to demonstrate these valuable tools by building a REST API in <strong>15 seconds</strong> with a few Shell commands on my computer. From there, I began to speak to how each of the major features of SailsJS can be used in your applications:</p>
<ul>
<li>SailsJS utilizes Grunt for all of its front-end asset management. This allows for endless extensibility to fit your development needs; like the SASS support that was added during my talk.</li>
<li>SailsJS uses the <a href="https://github.com/balderdashy/waterline">Waterline ORM</a> and <a href="http://sailsjs.org/documentation/reference/blueprint-api">blueprints</a> for developers to tap into a stereotypical MVC (model view controller) architecture without having to build each method or route by hand. Thus custom REST API endpoints can be created with a few Shell commands.</li>
<li>Logic called <a href="http://sailsjs.org/documentation/concepts/policies">policies</a> can be implemented to protect specific actions/routes/content from users who may not be authorized to access them.</li>
<li>Because it is a NodeJS framework, any NodeJS module or <a href="http://node-machine.org/">Node Machine</a> can be injected to extend the base functionality.</li>
</ul>
<h3 id="codeandcontent">Code and Content</h3>
<p>Below you will find all of the code and content that was utilized for this session:</p>
<ul>
<li><a href="https://github.com/dniccum/sailsjs-phone-book">Code repository</a></li>
<li><a href="https://drive.google.com/open?id=11EzQ8Y8DUQw4PDXX3Xjf0Wxq7rVIU5YiCHt7TtZK-ec">Slide deck</a></li>
</ul>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Achieving search within AEM the Bridgestone way - a case study]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>Having a website or web application with a search feature is nearly a requirement nowadays for the optimum user experience that we are all trying to achieve. Why spend all that time looking for that one item through layers upon layers of navigation and menus when you can simply type</p>]]></description><link>https://blog.dniccumdesign.com/achieving-search-within-aem-the-bridgestone-way-a-case-study/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4b8</guid><category><![CDATA[search]]></category><category><![CDATA[case study]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Fri, 21 Oct 2016 15:14:44 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2016/10/bridgestone-search.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://blog.dniccumdesign.com/content/images/2016/10/bridgestone-search.png" alt="Achieving search within AEM the Bridgestone way - a case study"><p>Having a website or web application with a search feature is nearly a requirement nowadays for the optimum user experience that we are all trying to achieve. Why spend all that time looking for that one item through layers upon layers of navigation and menus when you can simply type it or even speak it using your mobile device? As rudimentary as a search feature may appear to the average &quot;joe,&quot; the decisions made during the experience, design, and development processes have a drastic effect on your website. Sometimes these decisions come down to a simple question: do you want your search fast, or extensive and intelligent? Either way, each option supplies its own list of pros and cons that either the developer or the user will have to endure.</p>
<h2 id="searchingwithinaem">Searching within AEM</h2>
<p>When building a search function within AEM, extra considerations need to be taken into account in regards to the application's dispatcher. Do we want to bypass the dispatcher and risk the extra load on the application server of completely reloading the entire view over and over during the search and filtering process? Or do we want to risk some potential in-page performance issues if we load everything into the page via an API service, create a complicated Javascript application to digest the service's content, and even sacrifice SEO for the page?</p>
<p>The Bridgestone development team at VML Kansas City has implemented multiple solutions to the &quot;search problem&quot; with mixed results in all facets. As I continue to explain what we did to execute these tasks, here's one thing to keep in mind as you read on: each website had a different audience as well as content types/amounts to consider.</p>
<h2 id="partyinthebackwithfirestonecommercial">Party in the back with Firestone Commercial</h2>
<p>The <a href="http://commercial.firestone.com">Firestone Commercial website</a> was a gigantic undertaking for the entire team involved, and I'm sure those people cannot emphasize this enough. The task of effectively combining three separate business units, or BU's, into a single cohesive environment was not easy. As you would expect, each BU wanted equal exposure and representation throughout the site. These expectations had to be implemented within the product catalogs and site searches as well.</p>
<h3 id="products">Products</h3>
<p>If you look at the <a href="http://commercial.firestone.com/en-us/solutions/agriculture/product-search">Agriculture product search page</a>, you will notice a slight flicker during the page's initial load. That is because the list of products is being retrieved from an API endpoint supplied by an AEM service and then parsed using AngularJS. The array of results is rendered within the DOM using a stereotypical AngularJS <em>ng-repeat</em>.</p>
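<p>The general shape of that template looked something like this. This is hypothetical markup for illustration only, not the production Firestone code; the controller and property names are my own:</p>

```html
<!-- Hypothetical markup (not the production code): results from the
     AEM API endpoint rendered with a standard AngularJS ng-repeat -->
<ul ng-controller="ProductSearchCtrl">
  <li ng-repeat="product in products">
    <h3>{{ product.name }}</h3>
  </li>
</ul>
```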
<p>I do want to point out that if you watch the &quot;Network&quot; tab in your Developer Tools, you'll see a new API call made every single time you filter or search your content. Once the new call is made, you will see your content change beneath it. Additionally, you will also see that the &quot;Off-the-Road&quot; and &quot;Truck &amp; Bus&quot; buttons take you to a completely different page. This is because the available filters change from BU to BU.</p>
<h3 id="sitesearch">Site Search</h3>
<p>Other than the autocomplete function available within the search overlay that appears upon clicking the &quot;Search&quot; button within the navigation, the full site search is done &quot;the old fashioned way.&quot; This consists of taking the query string, passing it to a class that runs it through a series of X-Path queries, and then returning the results. We currently have a few additional methods in place that apply weighting to the results to improve the accuracy of the search. Because of the lack of front-end API calls, this does mean that each returned result does in fact hit the application server and cannot be cached by the dispatcher.</p>
<h2 id="bringingittothefrontwithbridgestonecommercial">Bringing it to the front with Bridgestone Commercial</h2>
<p>The functionality of both the product filtering and site search for the <a href="http://commercial.bridgestone.com">Bridgestone Commercial website</a> took a bit of a technological evolutionary turn when it came time to implement these features. While both of these components still made use of API endpoints much like Firestone's product catalog, the view was not required to reload; allowing for a more immediate and optimized experience.</p>
<h3 id="products">Products</h3>
<p>Initially when the <a href="http://commercial.bridgestone.com/en-us/products">Bridgestone Commercial product catalog</a> was constructed, all of the product data for the Bridgestone product catalog was loaded into the page at once. Just to clarify, when I say all of the product data, <strong>I mean all of it</strong>. So the opening page load included the products for both the construction and truck business units, as well as all of the data necessary for the filters. We then used a pretty complex AngularJS application to parse and sort the data. The major difference with this implementation versus Firestone was the ability to serve product data all on the same page for all of the business units, as well as dynamically modify the available filters based on the visible products.</p>
<h4 id="version11">Version 1.1</h4>
<p>Soon after the website's launch we started to receive some internal requests regarding the product catalog. The creative team was not happy with the animations, or lack thereof, when a filter was changed. The organic search team was also complaining about the lack of SEO on the page; none of the product data was present until after the Javascript had rendered the app, and unfortunately Google does not effectively crawl single-page Javascript applications. As if that wasn't enough, we were receiving reports of performance issues within the page. The web page was being forced to store a massive amount of information on load, which caused users on less powerful machines or mobile devices some interaction delays when filtering the product information. Obviously these detriments to the user experience were unacceptable.</p>
<p>After some brainstorming, my solution was to render all of the products and their respective data within the AEM Sightly template, outside of Javascript. This provided a number of solutions and optimizations. First, because the product data would be rendered even without Javascript, the product information would be available for Google to crawl. Second, since this cut down on the volume of data the page had to hold in memory, our Javascript performance improved significantly. Finally, and most importantly, our creatives were satisfied with the introduction of filtering animations; the newly achieved optimizations allowed us to add CSS3 animations to all of the products to polish up the filtering process and experience.</p>
<h3 id="sitesearch">Site Search</h3>
<p>The site search was highly simplified by moving all of the functionality to a servlet with an API endpoint that allowed us to make GET requests and receive results based on the query parameter. These results would then be rendered using a simple AngularJS application. Much like Firestone's, we used additional methods to add weighting to the returned search results.</p>
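<p>The weighting idea is straightforward to sketch. The fields and weight values below are purely illustrative, not what we actually shipped:</p>

```javascript
// Score a result higher when the query matches a more prominent field.
// These field names and weights are illustrative only.
var WEIGHTS = { title: 10, description: 3, body: 1 };

function scoreResult(result, query) {
  var q = query.toLowerCase();
  return Object.keys(WEIGHTS).reduce(function (score, field) {
    var text = (result[field] || '').toLowerCase();
    return text.indexOf(q) !== -1 ? score + WEIGHTS[field] : score;
  }, 0);
}

// Sort the raw results by descending score before rendering them.
function rankResults(results, query) {
  return results.slice().sort(function (a, b) {
    return scoreResult(b, query) - scoreResult(a, query);
  });
}
```

<p>With a scheme like this, a query that matches a page title outranks one that only matches body copy, which is the kind of accuracy tweak described above.</p>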
<h2 id="theprosandcons">The pros and cons</h2>
<h3 id="firestonecommercial">Firestone Commercial</h3>
<p><em>Using the server to render the results</em></p>
<p><strong>The Pros</strong></p>
<ul>
<li>Limited Javascript/AngularJS needed</li>
<li>Results can be easily optimized for search</li>
<li>Allows for large amounts of data</li>
</ul>
<p><strong>The Cons</strong></p>
<ul>
<li>The search must bypass the AEM dispatcher</li>
<li>Large amounts of traffic can be hard on an application server without the proper load balancing</li>
<li>Requires multiple pages for customized experiences</li>
</ul>
<h3 id="bridgestonecommercial">Bridgestone Commercial</h3>
<p><em>Using Javascript/AngularJS to render and filter the results</em></p>
<p><strong>The Pros</strong></p>
<ul>
<li>Appropriate SEO can be achieved if data is rendered correctly</li>
<li>Can dynamically change the filtering experience based on user interaction</li>
<li>All but eliminates impact on the application server; the dispatcher does the heavy lifting</li>
</ul>
<p><strong>The Cons</strong></p>
<ul>
<li>Extremely complex Javascript/AngularJS applications</li>
<li>Can result in &quot;chunky&quot; animations and user experience</li>
<li>If not treated correctly, content will not be SEO-friendly</li>
</ul>
<h2 id="wrappingup">Wrapping Up</h2>
<p>Each implementation that we have used yielded its own unique results, both good and bad. Like I said above, each team and project will need to weigh these pros and cons based on the project's needs, the talent and time available, and the acceptable user-experience debt. In the end, the decision is up to you!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Some hurdles when using Angular 2 in an existing app]]></title><description><![CDATA[Angular 2 by itself is a great platform for client side applications. But what if you want to include it with something else? Here are some issues that I've found.]]></description><link>https://blog.dniccumdesign.com/some-hurdles-when-using-angular-2-in-an-existing-app/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4b6</guid><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Tue, 27 Sep 2016 20:23:33 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>At this point, even if you have the slightest interest in front end development, you don't have to look very far to find that Angular 2 has been officially released. Now, if you have had some time to mess around with <a href="https://angular.io/">Angular 2</a>, I'm sure that you have quickly realized that, in terms of asset management and file inclusion, it isn't like Angular 1. Not even close.</p>
<h2 id="thefiledifference">The file difference</h2>
<p>If you aren't sure what I'm talking about, here's why, at a very high level.</p>
<p>Angular 2, unlike Angular 1, is no longer a plug-and-play type of library like <a href="https://jquery.com/">jQuery</a> or <a href="http://mootools.net/">Mootools</a>. With those libraries you could include one or two files in the page and <em>BOOM!</em> you had a working Javascript framework. Sadly those days are long gone. Because stock Angular 2 is written in <a href="https://www.typescriptlang.org/">Typescript</a>, your Javascript now needs to be compiled or your browser won't be able to parse it. To save us from having to manually include each file <em>(ugh)</em>, Angular 2 uses a package called <a href="https://github.com/systemjs/systemjs">SystemJS</a> to dynamically load all of the available Javascript files into your application. I can speak from experience: when you are working with the large number of Javascript files that an Angular 2 project commonly accumulates, this is an extreme luxury.</p>
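<p>For a rough idea of what that dynamic loading looks like, here is a minimal SystemJS configuration in the style of the Angular 2 quickstart. The paths and package names below are the usual quickstart defaults, not anything project-specific:</p>

```javascript
// systemjs.config.js: tell SystemJS where modules live and how to resolve them.
System.config({
  paths: {
    'npm:': 'node_modules/'
  },
  map: {
    app: 'app',
    '@angular/core': 'npm:@angular/core/bundles/core.umd.js'
  },
  packages: {
    // Any import from 'app' resolves to app/main.js by default.
    app: { main: './main.js', defaultExtension: 'js' }
  }
});

// The page then boots the whole application with a single dynamic import.
System.import('app').catch(function (err) { console.error(err); });
```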
<h2 id="luxurycomeswithaprice">Luxury comes with a price</h2>
<p>As you would expect, great luxury comes at a great price. The price that I'm referring to is a few unspoken <em>gotchas</em> that came up and consequently led to hours of frustration.</p>
<h3 id="minification">Minification</h3>
<p>After a few different configurations and experiments, I have concluded that the Angular 2 dependencies, which include SystemJS, <a href="https://github.com/angular/zone.js/">ZoneJS</a> and others, cease to work when you try to minify them; this also applies to any and all of the Javascript files compiled from your Typescript. The required detour for this issue is to manually include these packages, unminified, within your application.</p>
<p>The unfortunate by-product of these manual includes, however, is increased file size and, consequently, longer load times.</p>
<h3 id="typings">Typings</h3>
<p>This concept took me a little while to get a grasp of. Typescript scrutinizes every bit of Javascript you write; while this isn't necessarily a bad thing, it can make it difficult to incorporate additional libraries like jQuery and <a href="http://materializecss.com">Materialize</a>. For instance, if I wanted to use a Materialize toast within my code, I would normally use it like so:</p>
<pre><code>toast('Your password is wrong', 3000, 'red');
</code></pre>
<p>Your Typescript compiler would scream and fail, claiming that it doesn't recognize the <code>toast</code> function or its parameters. This is because, as far as Typescript is concerned, the Materialize or jQuery objects don't exist in its namespace; almost as if you had never imported them. Typings tell Typescript what is available and what types of parameters should be allowed. <a href="https://github.com/DefinitelyTyped/DefinitelyTyped/blob/master/materialize-css/materialize-css.d.ts">See this code for an example</a>.</p>
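<p>The fix is an ambient declaration that teaches the compiler the shape of the global. A minimal, hand-rolled version covering the toast call above might look like this; the DefinitelyTyped file linked above is the complete version:</p>

```typescript
// materialize.d.ts: an ambient declaration for Materialize's global toast().
// This file emits no runtime code; it only describes what already exists.
declare function toast(
  message: string,
  displayLength: number,
  className?: string,
  completeCallback?: () => void
): void;
```

<p>Once this file is included in your compilation, the <code>toast(...)</code> call compiles cleanly because Typescript now knows the function's signature.</p>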
<h3 id="directoryimports">Directory imports</h3>
<p>More often than not, when I am scaffolding a new project, I use something like <a href="http://yeoman.io/">Yeoman</a> or the <a href="https://cli.angular.io/">official Angular 2 CLI</a> to build all of the necessary assets for me. Why? I'm lazy <em>(and proud of it)</em>, and I would probably miss something if I tried to implement them myself every single time.</p>
<p>On occasion, though, I do have to scaffold my projects myself. As I quickly found out, much like I outlined above, when you want to import a new Angular 2 dependency, you need the entire directory. This means that you will have to add a couple of lines to your <code>Gulpfile.js</code> or <code>Gruntfile.js</code> to copy the entire directory (index.js and all other files included) to the directory where Angular 2's boot process happens. This can prove troublesome when you use NPM to update the project's dependencies. Sometimes your initializer will not fully overwrite these dependencies, and then you will need to manually remove your <code>node_modules</code> directory and do a fresh install. Of course, you get all sorts of pretty errors when that happens.</p>
<p>This isn't necessarily an issue, it's just a nuisance.</p>
<h2 id="takingthegoodwiththebad">Taking the good with the bad</h2>
<p>I want to disclose that I do not even remotely consider myself a pro at Angular 2 yet. I'm still trying to figure out how to use it to its fullest potential, like most others.</p>
<p>With all of that said, I do see a lot of power in Angular 2 and I am excited to see where it goes. Hopefully some of my early frustrations and mistakes will prevent you from doing the same.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[The end is nigh for Bower?]]></title><description><![CDATA[The NodeJS package known as Bower has provided developers with a great set of tools to quickly scaffold their projects. But is it on the way out?]]></description><link>https://blog.dniccumdesign.com/the-end-is-nigh-for-bower/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4b5</guid><category><![CDATA[editorial]]></category><category><![CDATA[javascript]]></category><category><![CDATA[bower]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Thu, 22 Sep 2016 19:53:16 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2016/09/bower-graveyard.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://blog.dniccumdesign.com/content/images/2016/09/bower-graveyard.jpg" alt="The end is nigh for Bower?"><p>Over the last four or five years, I have come to have a deep adoration for the front end package manager <a href="https://bower.io/">Bower</a>. Going to the official jQuery or Angular repositories to download dependencies was a thing of the past. The deeper integrations with scaffolding tools like Yeoman made the attraction even more irresistible. The fact that I could build a basic project with a couple of shell commands just blew my mind the first few times.</p>
<p>Now that Angular 2 has <strong>finally</strong> been released and is no longer in the purgatory of beta or release candidate, I am starting to see a trend in the support of public projects and the technologies that support them. That trend is that <strong>these new projects are not supported by Bower.</strong></p>
<p>I would like to note that these trends are somewhat specific to brand-new projects like Angular 2 and <a href="http://ionicframework.com/docs/v2/getting-started/installation/">Ionic 2</a>. Projects like Angular 1 and Ionic 1 that are currently being maintained, though not improved, are still supported with Bower-compatible packages; but I digress...</p>
<p>But why would I be thinking about this? Is there something that Bower did wrong to seal its demise? I have a few ideas.</p>
<h2 id="itsanodejspackage">It's a NodeJS package</h2>
<p>So what if it is a NodeJS package? We use helpers like Grunt and Gulp all of the time to process code and to make our lives easier as developers. Why would Bower be any different? Here's my theory: because it is a required dependency just to install further dependencies, and it brings its own set of shell commands that have to be <em>automated</em>.</p>
<p>When working with CMS applications like <a href="http://www.adobe.com/marketing-cloud/enterprise-content-management.html">Adobe Experience Manager</a> or even platforms like <a href="https://laravel.com/">Laravel</a> (PHP-based), these tools ship with scaffolding tools or provide a way for you to create your own. The majority of these assets require you to use Node Package Manager (npm) to install and run their associated dependencies. It has become so common that it is second nature for us as developers to clone a repository and then run <code>$ npm install</code>. Configuring AEM to run this command isn't very hard either. The issues that I can identify lie in the following facts:</p>
<ul>
<li>Bower downloads its packages differently than NodeJS does</li>
<li>It has a different set of configurations and requires additional files to apply these configurations</li>
<li>It is discouraged to store your Bower-related files or <em>bower_components</em> in the same place as your node modules. Bad things start to happen when you do</li>
</ul>
<p>Each of these points could become troublesome if you are trying to develop in an existing ecosystem that was never designed to support Bower.</p>
<p>Additionally, in most instances you have to have Bower installed globally on your machine. If not (if, for instance, it is merely a dependency of your NodeJS application), then you are limited to calling the Bower command by its full path, like so:</p>
<pre><code>/usr/local/bin/bower install
</code></pre>
<p>Now you are liable to manage two different sets of errors and feedback. And as I pointed out earlier, if this Bower project has been automated, debugging these errors can be troublesome.</p>
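<p>One mitigation worth noting: npm adds <code>node_modules/.bin</code> to the PATH for anything it runs via an npm script, so a locally installed Bower can be wrapped in a hypothetical <code>package.json</code> fragment like this:</p>

```json
{
  "scripts": {
    "bower": "bower install"
  }
}
```

<p>Now <code>npm run bower</code> finds the local binary without the hard-coded <code>/usr/local/bin</code> path, though you are still juggling two package managers.</p>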
<h2 id="thelackofupdates">The lack of updates</h2>
<p>If you look at the <a href="https://github.com/bower/bower/releases">Bower repository release page</a>, you can see that the platform hasn't been touched (at the time of this article being written) since April. For a platform that was as widely used as this one, especially when you throw in the recent macOS Sierra update, the lack of updates should come as a bit of an alarm to you.</p>
<h2 id="nonewprojectadaptation">No new project adaptation</h2>
<p>As I said above, if you are familiar with Angular 2, Ionic 2, or even <a href="http://foundation.zurb.com/">Zurb's Foundation</a>, currently <strong>none</strong> of these projects have developed Bower components. It is also worth touching on the fact that these three projects have a corresponding command-line interface (CLI) to initialize or compile their assets. As you would expect, these CLIs use NodeJS to install their dependencies. By not relying on Bower, applications such as these can scaffold a project without having to call additional shell commands, resulting in both additional speed and stability within the applications.</p>
<h2 id="justafewthoughts">Just a few thoughts</h2>
<p>As I draw this <em>techy</em> rant to a close, keep in mind that I am basing my conclusions purely off of observations and some basic research. I proudly disclose that I have not spoken to the Bower guys about the longevity of their project, nor do I have any usage metrics to show a decline. Again, just a few thoughts that I tied together into a hypothesis. For all I know, Bower could be alive and well.</p>
<p>But nevertheless, we as developers have to take note of this evidence and begin to make decisions when we are scaffolding a new project or making enhancements to existing projects.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Adding Babel support to an AEM instance]]></title><description><![CDATA[<!--kg-card-begin: markdown--><p>This exposé will outline the steps to be able to add Babel support to an existing AEM instance.</p>
<h2 id="whatisbabel">What is Babel</h2>
<p><a href="https://babeljs.io/">Babel</a> is a compiler/processor of &quot;futuristic&quot; javascript and relative libraries like JSX and Flow. Babel provides you with the tools, whether it be a pre-processor (like</p>]]></description><link>https://blog.dniccumdesign.com/adding-babel-support-to-an-aem-instance/</link><guid isPermaLink="false">5e90a9c532e37540c4e8c4b4</guid><category><![CDATA[frontend]]></category><category><![CDATA[javascript]]></category><category><![CDATA[gulp]]></category><category><![CDATA[tutorial]]></category><dc:creator><![CDATA[Doug Niccum]]></dc:creator><pubDate>Mon, 01 Aug 2016 18:50:10 GMT</pubDate><media:content url="https://blog.dniccumdesign.com/content/images/2016/08/20150914160024_567.png" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://blog.dniccumdesign.com/content/images/2016/08/20150914160024_567.png" alt="Adding Babel support to an AEM instance"><p>This exposé will outline the steps to be able to add Babel support to an existing AEM instance.</p>
<h2 id="whatisbabel">What is Babel</h2>
<p><a href="https://babeljs.io/">Babel</a> is a compiler/processor of &quot;futuristic&quot; javascript and related libraries like JSX and Flow. Babel provides you with the tools, whether it be a pre-processor (like Gulp or Grunt) plugin or a command-line interface, to compile <a href="http://es6-features.org/">ECMAScript 6</a> javascript into legacy-browser-compatible ECMAScript 5 javascript.</p>
<p>For example, one of the new ways that ES6 represents a function is like this:</p>
<pre><code>let functionName = () =&gt; {
	console.log('does something');
};
</code></pre>
<p>While this Javascript above would run completely fine in an up-to-date version of Chrome or Firefox, this would in turn cause all sorts of syntax errors in legacy versions of IE. So what Babel does is turn this:</p>
<pre><code>let functionName = () =&gt; {
	console.log('does something');
};
</code></pre>
<p>into this:</p>
<pre><code>var functionName = function() {
	console.log('does something');
};
</code></pre>
<p>That's much better. While the changes are pretty subtle in this example, these changes result in valid, working Javascript versus unrecognizable, broken Javascript.</p>
<p><strong>NOTE:</strong> For the sake of this example, we will be using the Babel ECMAscript 2015 preset plugin. For more information on plugins, <a href="https://babeljs.io/docs/plugins/">refer to this page</a>.</p>
<h2 id="step1settingupyourmavenbuildscript">Step 1 - Setting up your Maven build script</h2>
<p>First we are going to set up the Maven build script. This script will run during a normal Maven build.</p>
<p>Within your AEM installation you will have a directory within the <code>clientlibs</code> directory called <code>development</code>. <em>It is worth noting that the names of these directories could change from project to project.</em> This directory will house all of your SASS/SCSS, front-end build scripts, and fonts if you have any. Within it, create a directory called <code>compile-js</code>. Next we are going to create these three files and their associated contents:</p>
<p><strong>.gitignore</strong></p>
<pre><code>node_modules
</code></pre>
<p><strong>gulpfile.js</strong></p>
<pre><code>var gulp,
  srcFolder,
  jsBaseFolder,
  babel,
  es2015,
  rename;

gulp = require('gulp');
babel = require('gulp-babel');
es2015 = require('babel-preset-es2015');
rename = require('gulp-rename');

srcFolder = '../../';
jsBaseFolder = srcFolder + 'foot/js/';

// THIS DEFINES THE GULP TASK
gulp.task('compile-js', function() {
  return gulp.src([jsBaseFolder + '**/*.js', '!' + jsBaseFolder + '_compiled/**', '!' + jsBaseFolder + 'vendor/**'])
    .pipe(babel({
      &quot;presets&quot;: [es2015]
    }))
    .pipe(rename({dirname: ''}))
    .pipe(gulp.dest(jsBaseFolder + '_compiled'))
});

// THIS RUNS THE TASK FIRED BY THE MAVEN BUILD
gulp.task('build', ['compile-js']);
</code></pre>
<p><strong>package.json</strong></p>
<pre><code>{
  &quot;name&quot;: &quot;maven-babel-build&quot;,
  &quot;version&quot;: &quot;0.1.0&quot;,
  &quot;description&quot;: &quot;Gulp build that compiles ES6 javascript using Babel.&quot;,
  &quot;main&quot;: &quot;gulpfile.js&quot;,
  &quot;scripts&quot;: {
    &quot;gulp&quot;: &quot;gulp&quot;,
    &quot;build&quot;: &quot;gulp build&quot;
  },
  &quot;author&quot;: &quot;&quot;,
  &quot;license&quot;: &quot;ISC&quot;,
  &quot;dependencies&quot;: {
    &quot;babel-preset-es2015&quot;: &quot;6.9.0&quot;,
    &quot;gulp&quot;: &quot;3.8.11&quot;,
    &quot;gulp-babel&quot;: &quot;6.1.2&quot;,
    &quot;gulp-rename&quot;: &quot;^1.2.2&quot;
  }
}
</code></pre>
<p>The combination of these three files installs the necessary NodeJS dependencies, sets up the Gulp task that Maven will run during the build process, and the gitignore provides an extra layer of protection from accidentally adding the compiled files to your repository.</p>
<p><strong>NOTE:</strong> You may need to change the <code>srcFolder</code> and <code>jsBaseFolder</code> based on the names and locations of your development directories.</p>
<h2 id="step2addingyourcompilationdestinationdirectory">Step 2 - Adding your compilation destination directory</h2>
<p>Next we need to create the directory where Babel will store all of the compiled Javascript and from which it will be served to AEM. This directory will be created within the <code>js</code> directory that is located within the <code>foot</code> directory. Once you have found the right folder, create a new directory named <code>_compiled</code> (just like in the gulp file up above) and, within the newly created directory, create another <code>.gitignore</code> file with the following contents:</p>
<pre><code>*
# Except this file
!.gitignore
</code></pre>
<p>This will allow Git to add the directory to the repository but ignore the directory's contents. This folder has to be present or the Gulp scripts will not run.</p>
<h2 id="step3addingthemavenprofiletotheprojectsconfiguration">Step 3 - Adding the Maven profile to the project's configuration</h2>
<p>Now that we have the Gulp script set up, we need to write a task/profile for Maven to actually run the task during its build process. Before we add the profile, we need to make sure that the Maven Antrun plugin is a dependency of the project. Open up the <code>pom.xml</code> directly within the root of the project (not the component <code>pom.xml</code>). Within the build -&gt; pluginManagement -&gt; plugins nodes, add the following snippet:</p>
<pre><code>&lt;plugin&gt;
  &lt;groupId&gt;org.apache.maven.plugins&lt;/groupId&gt;
  &lt;artifactId&gt;maven-antrun-plugin&lt;/artifactId&gt;
  &lt;version&gt;1.7&lt;/version&gt;
&lt;/plugin&gt;
</code></pre>
<p>This adds the Maven Antrun plugin as a project dependency. Now that this is complete, we need to add the build profile that will compile the ECMAscript using Babel during a Maven build. This will need to be done within the <code>pom.xml</code> file that is located within the components directory.</p>
<p>If you go ahead and open up this file, scroll down to the <code>profiles</code> node, and within this node, paste the following code:</p>
<pre><code>&lt;profile&gt;
	&lt;id&gt;compile-es6-javascript&lt;/id&gt;
	&lt;build&gt;
	    &lt;plugins&gt;
	        &lt;plugin&gt;
	            &lt;groupId&gt;org.apache.maven.plugins&lt;/groupId&gt;
	            &lt;artifactId&gt;maven-antrun-plugin&lt;/artifactId&gt;
	            &lt;executions&gt;
	                &lt;execution&gt;
	                    &lt;id&gt;exec-gulp-compile-es6-javascript&lt;/id&gt;
	                    &lt;phase&gt;generate-resources&lt;/phase&gt;
	                    &lt;goals&gt;
	                        &lt;goal&gt;run&lt;/goal&gt;
	                    &lt;/goals&gt;
	                    &lt;configuration&gt;
	                        &lt;tasks&gt;
	                            &lt;echo message=&quot;Compiling valid ECMAscript 6 with Babel&quot; /&gt;
	                            &lt;property name=&quot;antExecDir&quot; value=&quot;${project.build.directory}/../src/main/content/jcr_root/etc/clientlibs/{THE PATH TO THE DEVELOPMENT DIRECTORY}/development/compile-js/&quot; /&gt;
	                            &lt;echo message=&quot;Exec dir: ${antExecDir}&quot; /&gt;
	                            &lt;exec dir=&quot;${antExecDir}&quot; executable=&quot;npm&quot; spawn=&quot;false&quot; failonerror=&quot;true&quot;&gt;
	                                &lt;arg value=&quot;install&quot; /&gt;
	                            &lt;/exec&gt;
	                            &lt;exec dir=&quot;${antExecDir}&quot; executable=&quot;gulp&quot; spawn=&quot;false&quot; failonerror=&quot;true&quot;&gt;
	                                &lt;arg value=&quot;build&quot; /&gt;
	                            &lt;/exec&gt;
	                        &lt;/tasks&gt;
	                    &lt;/configuration&gt;
	                &lt;/execution&gt;
	                &lt;execution&gt;
	                    &lt;id&gt;clean-compiled-javascript&lt;/id&gt;
	                    &lt;phase&gt;clean&lt;/phase&gt;
	                    &lt;goals&gt;
	                        &lt;goal&gt;run&lt;/goal&gt;
	                    &lt;/goals&gt;
	                    &lt;configuration&gt;
	                        &lt;tasks&gt;
	                            &lt;echo message=&quot;Removing the old compiled EcmaScript 6 files&quot; /&gt;
	                            &lt;property name=&quot;antCleanDir&quot; value=&quot;${project.build.directory}/../src/main/content/jcr_root/etc/clientlibs/{THE PATH TO THE FOOT DIRECTORY}/foot/js/_compiled&quot; /&gt;
	                            &lt;delete file=&quot;${antCleanDir}/**&quot; /&gt;
	                            &lt;echo message=&quot;Done removing the old javascript&quot; /&gt;
	                        &lt;/tasks&gt;
	                    &lt;/configuration&gt;
	                &lt;/execution&gt;
	            &lt;/executions&gt;
	        &lt;/plugin&gt;
	    &lt;/plugins&gt;
	&lt;/build&gt;
&lt;/profile&gt;
</code></pre>
<p>If you are using an IDE like <a href="https://www.jetbrains.com/idea/">IntelliJ</a> or <a href="https://eclipse.org/downloads/">Eclipse</a>, this will effectively add an option for you to run the &quot;compile-es6-javascript&quot; task during your Maven build process. <strong>Make sure you enable this option</strong>, otherwise when you refresh the page, there will not be any Javascript present.</p>
<h2 id="step4addingtheprofiletojenkinsoptional">Step 4 - Adding the profile to Jenkins (optional)</h2>
<p><strong>NOTE:</strong> If you don't use Jenkins for deployment automation, skip this step.</p>
<h3 id="toaddtoastandardbuild">To add to a standard build</h3>
<p>Login to Jenkins, go to the specific job that you would like to enable the newly created &quot;compile-es6-javascript&quot; profile for. Then open up the &quot;Configure&quot; page and scroll down to the Build section. It should look something like this:</p>
<p><img src="https://blog.dniccumdesign.com/content/images/2016/08/Screen-Shot-2016-08-01-at-1-27-17-PM.png" alt="Adding Babel support to an AEM instance"></p>
<p>Notice the &quot;Goals and options&quot; input. The new profile that we just created was inserted into the comma-delimited list of profiles. This tells Jenkins to run the profile during the build process.</p>
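<p>For example, a standard build's &quot;Goals and options&quot; value might end up looking something like the hypothetical line below; the other profile names are placeholders for whatever your job already uses:</p>

```
clean install -Paem-6.1-deps,autoInstallPackageAuthor,compile-es6-javascript
```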
<h3 id="toaddtoastagerelease">To add to a stage release</h3>
<p>If you are adding this new profile to a Maven stage release, the configurations are slightly different to prevent Maven from throwing some annoying errors complaining about snapshots missing.</p>
<p>The value that we are going to change resides within the <strong>Build Environment</strong> section, versus the <strong>Build</strong> section that is listed above. There you will add the same &quot;compile-es6-javascript&quot; value to the <strong>Release goals and options</strong> input. So it will look something like this:</p>
<pre><code>-Dresume=false clean release:clean release:prepare release:perform -Pinstall-into-6.1stage,aem-6.1-deps,compile-es6-javascript,autoInstallPackageAuthor -Darguments=&quot;-Pinstall-into-6.1stage,aem-6.1-deps,compile-es6-javascript,autoInstallPackageAuthor&quot;
</code></pre>
<p>It is extremely important that this is only entered here, to prevent Jenkins from failing due to missing snapshot .jar files.</p>
<h2 id="wrappingitup">Wrapping it up</h2>
<p>Before we tie a bow on this, I understand that all environments are different; paths and directories can be in different places and named differently. However, I hope that the spirit of this workflow is detailed enough for you to implement it in future projects.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>