
We want you on stage at Meet Magento Croatia!

Thinking about whether you should join us for Meet Magento Croatia? If you haven’t thought about it by now, you should definitely start! Here’s why…

Meet Magento Croatia is part of the global family of Meet Magento events, which gather eCommerce experts from all over the world. These one-, two- or three-day events serve as the best possible platforms for connecting and getting all the latest info from the Magento world.

And not only that – they are the perfect place where eCommerce trends are discussed and hot Magento topics are tackled. The Developers Paradise event we organized in April proved, once again, that events channel the power of community. In fact, that’s what makes them special!

How can you join?

By signing up, of course. We’d like to see you as a speaker or, alternatively, as a delegate. We know many of you have a lot of miles in your Magento shoes and we’d like to hear all about the roads you’ve been on. It’s no secret you’ve breathed life into numerous web shops – now it’s time to shine and share with the community how you did it.

[Image: mm17hr-tekst]

We are interested in hearing your story (inside out)!

Along the way, you overcame different obstacles and tackled various challenges – by sharing your story, you are empowering the community to grow. We all want to improve, and we would highly appreciate you telling us how exactly you are doing it – day in, day out.

The only thing is…

There’s not much time left to apply so you better hurry!

Final deadline for getting your applications in is December 31st. We’ll make sure to review all of your applications and get back to you as soon as possible. At this point, we can promise you, if you make it to the #MM17HR stage, you’ll be around some pretty amazing Magento and eCommerce experts. 😉 We’ll be dropping names soon, so make sure to stay tuned to see who we’ll host.

So, come on – tell us what you did and, more importantly, how you did it, and we’ll get you on the MM17HR stage.

See you in Osijek, Croatia, home of Inchooers!

The post We want you on stage at Meet Magento Croatia! appeared first on Inchoo.


How to set up a CDN (Amazon CloudFront) in Magento

If you are using Amazon AWS CloudFront and you want to set up a CDN in your Magento store, this tutorial will take you step by step through setting up CloudFront and Magento to get them both working.

STEP 1 – DISTRIBUTIONS

The first thing you have to do is set up a Distribution in CloudFront. Below you can see an example of my custom Distribution.

[Image: CloudFront Distributions]

This is how the general options should look; of course, you can edit them to suit your needs.

[Image: CloudFront general tab]

As you can see here:
[Image: CloudFront certificate]

Amazon’s CloudFront uses its own certificate, so if your site uses SSL you can just use CloudFront’s domain name, such as “https://xyz13zxy.cloudfront.net”. You can also use a custom CDN URL with SSL, such as “https://cdn.yoursite.com”, but then you’ll have to import your own certificate.

Two very important options you need to be careful about are:

Alternate Domain Names (CNAMEs): this is used if you want nicer URLs for your content. For example, if your CNAME is “cdn.yoursite.com”, you will have to do some configuration in the cPanel of your own server to make it work; then, instead of “http://www.yoursite.com/images/image.jpg”, the URL is going to be “http://cdn.yoursite.com/images/image.jpg”.
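For illustration, the DNS side of such a CNAME can be sketched as a zone file entry (hostnames are the example ones from this post; your registrar or cPanel UI may present this differently):

```
; Hypothetical zone entry: point the CDN subdomain at the CloudFront distribution
cdn.yoursite.com.    3600    IN    CNAME    xyz13zxy.cloudfront.net.
```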

Domain Name: this is a domain name Amazon generates for you, for example “xyz13zxy.cloudfront.net”. If you don’t care about the URL of your CDN, this is the faster and easier way to set it up. So instead of, for example, “http://www.yoursite.com/images/image.jpg”, it’s going to be “http://xyz13zxy.cloudfront.net/images/image.jpg”.

So how does Amazon know which images to serve when you access them via “xyz13zxy.cloudfront.net”? This is where Origins come in.

STEP 2 – ORIGINS

Now you have to set up Origins. Amazon’s official documentation says: “When you create or update a distribution, you provide information about one or more locations—known as origins—where you store the original versions of your web content. CloudFront gets your web content from your origins and serves it to viewers. Each origin is either an Amazon S3 bucket or an HTTP server, for example, a web server”.

Origin Domain Name: this is your source domain name, for example “www.yoursite.com”.

Origin ID: this is an ID you specify yourself to easily identify this Origin. It can be something similar to your CDN domain name.

Origin Path: leave it empty. Amazon’s official documentation says: “Optional. If you want CloudFront to request your content from a directory in your Amazon S3 bucket or your custom origin, enter the directory name here, beginning with a /. CloudFront appends the directory name to the value of Origin Domain Name when forwarding the request to your origin, for example, myawsbucket/production.”

[Image: CloudFront Origins]

STEP 3 – BEHAVIOURS

The next thing you have to do is to set up Behaviours. Amazon’s official documentation says: “A cache behaviour lets you configure a variety of CloudFront functionality for a given URL path pattern for files on your website. For example, one cache behaviour might apply to all .jpg files in the images directory on a web server that you’re using as an origin server for CloudFront.”

When using a CDN you might sometimes get an error on your website similar to this one: “Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://xyz123zxy.cloudfront.net/images/images.jpg. This can be fixed by moving the resource to the same domain or enabling CORS.”

To fix this problem you have to set up Behaviours.

Origin: It’s very important to choose the right Origin from the dropdown box. This is the Origin you set up in the previous step.

Whitelist Headers: choose “Origin” from the left box and click on Add >>. When you have chosen your other options, click on Create/Edit.

Path Pattern: This is the path to the files you want to allow CORS for. I’m using “*” so it matches all files and allows cross-origin resource sharing.

[Image: CloudFront Behavior]
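Once the behaviour is deployed, you can verify that CloudFront actually returns a CORS header. A minimal sketch (the has_cors helper and the sample header dump are ours; in practice you would pipe real response headers from curl into it):

```shell
# has_cors: report whether a header dump on stdin contains an
# Access-Control-Allow-Origin header.
has_cors() {
  if grep -qi '^access-control-allow-origin' ; then
    echo "CORS enabled"
  else
    echo "CORS missing"
  fi
}

# Real-world usage (hypothetical domain):
#   curl -sI -H "Origin: http://www.yoursite.com" \
#     http://xyz13zxy.cloudfront.net/images/image.jpg | has_cors

# Sample header dump for illustration:
printf 'HTTP/1.1 200 OK\nAccess-Control-Allow-Origin: *\n' | has_cors
# → CORS enabled
```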

STEP 4 – INVALIDATIONS

You might be wondering how and when the cache is going to be invalidated or cached again. By default, each object automatically expires after 24 hours.

From Amazon’s official documentation: To change the cache duration for all objects that match the same path pattern, you can change the CloudFront settings for Minimum TTL, Maximum TTL, and Default TTL for a cache behaviour. For information about the individual settings, see Minimum TTL, Maximum TTL, and Default TTL. To use these settings, you must choose the Customize option for the Object Caching setting.

If you want to invalidate the cache for all files (which is not recommended) or just for one specific file, you have to create “Invalidations”.

Enter the path to your files and click on Invalidate.

[Image: CloudFront Cache Invalidation]
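The same invalidation can also be submitted from the command line with the AWS CLI’s create-invalidation command. A sketch with a hypothetical distribution ID and path – the echo only prints the command, so nothing is submitted by accident:

```shell
# Hypothetical distribution ID; find yours in the CloudFront console.
DISTRIBUTION_ID="E1ABCDEF2GHIJK"
# Invalidate one specific file rather than "/*" (all files, not recommended).
INVALIDATE_PATH="/media/catalog/product/image.jpg"
# Remove 'echo' to actually submit the invalidation:
echo aws cloudfront create-invalidation \
  --distribution-id "$DISTRIBUTION_ID" --paths "$INVALIDATE_PATH"
```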

STEP 5 – MAGENTO SETTINGS

That was the CloudFront side; now it’s time to configure Magento’s settings. It’s really easy: the only thing you have to do is configure Magento’s URLs. The screenshots below use SSL (HTTPS), so I used “https://…” everywhere. If you don’t use SSL, but regular HTTP, then you should use “http://…”.

As you may see, nothing has changed in Default settings.

[Image: Magento Website Settings]

Choose your Website from the dropdown and change only the fields where you want to set the CDN URL. I chose to set the CDN only for images and CSS files.

[Image: Magento Website Settings]
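In config terms, the fields changed above correspond to Magento’s core_config_data paths. A hedged sketch of the resulting website-scope values (the CloudFront domain is the example one from Step 1; images live under media and CSS under skin in Magento 1):

```
web/unsecure/base_media_url = https://xyz13zxy.cloudfront.net/media/
web/unsecure/base_skin_url  = https://xyz13zxy.cloudfront.net/skin/
web/secure/base_media_url   = https://xyz13zxy.cloudfront.net/media/
web/secure/base_skin_url    = https://xyz13zxy.cloudfront.net/skin/
```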

Save your settings, clear your cache and test your website. How do you know if your site uses the CDN now? Open your website (if you are using Chrome) and press CTRL+U to view the source code. If you can find something like “.cloudfront.net” or “cdn.yoursite.com” and all your content is loading properly, the CDN is set up correctly.

[Image: Magento CDN code]
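The same check works from the command line: fetch the page source and count CDN references. A sketch (the two lines of sample markup stand in for your page source; with a live site you would feed curl -s http://www.yoursite.com into the same grep):

```shell
# Count lines referencing the CDN host in the page source (sample markup):
printf '%s\n' \
  '<img src="https://xyz13zxy.cloudfront.net/media/catalog/product/image.jpg" />' \
  '<link rel="stylesheet" href="https://xyz13zxy.cloudfront.net/skin/frontend/css/styles.css" />' \
  | grep -c 'cloudfront.net'
# → 2 (both the image and the stylesheet are served from the CDN)
```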

The post How to set up a CDN (Amazon CloudFront) in Magento appeared first on Inchoo.

How many Magento certified people do you know?

How many Magento certified people are out there? 5,288 to date, according to the official data at everyone’s disposal. What about their distribution – where do they come from, what countries have the most certified individuals?

In this post you’ll find an overview of one interesting metric – Magento certified individuals per capita. Who tops the charts and where does your country stand? Read on to find out.

We’ve seen many different rankings in the Magento world, and thanks to Ben Marks and one of his recent tweets, I took it upon myself to dig just a bit deeper around the number of Magento certified people (totals include developers and solution specialists) per capita.

The initial conversation was about whether the Netherlands or Croatia is leading these rankings globally, but as it turns out, after I took a closer look at the official numbers, there are actually countries doing way better than us.

Why per capita?

Ok, but why use this metric in the first place? Well, it’s rather difficult to compare countries using absolute numbers. If we combine the absolutes with something like this (the number of Magento certified people per 1M of general population in each country) and then analyze the data, we can see some interesting trends and put things in a different perspective.

For example, we can see how in some countries the efforts of several individuals and companies can really make a difference and impact their local communities – and this is exactly what the metrics in this case are showing or hinting at.

The methodology behind this one was rather simple and straightforward:

    1. Go to official Magento certification directory
    2. Select country after country and simply collect these figures
    3. Then use some type of comparable statistics on population per country (rather difficult to find as each country has its own methods, but I chose this one – the data set used was from December 29th, 2016)
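The arithmetic behind step 3’s metric is simple. For example, Malta’s figure from later in this post (13 certified people; population of roughly 429,000, back-derived here from the stated 30,3 per 1M) works out as:

```shell
# certified-per-1M = certified * 1,000,000 / population
# (Malta's population is an approximation derived from the article's own figures.)
awk 'BEGIN { certified = 13; population = 429000; printf "%.1f\n", certified * 1000000 / population }'
# → 30.3
```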

Limitations of this method

It’s impossible to see people who have opted not to be visible in the directory, so the numbers are not 100% representative of the real situation with certifications.

Another one is – there are people (approx. 140 out of the 5288 listed) who don’t have any country associated with them, so they haven’t made it into the numbers.

And finally – smaller countries can improve their position more easily. But then again, there are only two countries with less than 1M people in Top 50 so that’s not that big of an issue.

You can check out some of the hard data below, but for the sake of this post, I’ll focus the analysis on the thing I found most interesting – finding out just how many certified Magento people there are per 1M of the total population in each country globally.

The “hard data”

  • there are 5288 certified individuals in Magento directory
  • they come from 73 countries
  • 7 countries have only one certified individual
  • 11 countries have more than 100 certified people
  • USA has the highest number of certified individuals – 849

Baltic countries top the charts

[Image: magento-certified-per-1m]

While it may come as a surprise that Latvia is the one leading the pack, the Scandiweb team has been doing an amazing job over there in recent years, getting a lot of their employees certified in the process and building up the community in this Baltic country. We’ve also gotten quite fond of them, as they sent the largest delegation to Developers Paradise 2016 in Opatija, so we’re happy to see our Latvian friends in a significant lead with 41,4 Magento certified individuals per one million people.

In a not-that-close second, we find Malta. Now, this is the only really debatable country to make it into the Top 5, as it doesn’t make much sense to say they have 30,3 people per 1M when they have a total of 13 people certified (and a total population of less than 500k). They are also the only country besides Montenegro with at least one person certified and a population of less than 1M, so they do deserve some credit in any case.

We can either take them out of the chart or change the metric to certified per 100k people (which we can do, or not), but after them there’s another Baltic country, a Latvian neighbour – Estonia – with 20,5 Magento certified people per 1M.

Yes, it helps that these three countries topping the charts have 1-2 million people (or much less in the case of Malta), but still – the smaller the country, the smaller the pool of people in which you can find those willing to dedicate their everyday work to a specific eCommerce software, so these numbers do hold value.

The Netherlands is hot and getting hotter

And then we have the Netherlands – hats off to them and everyone around the amazing community they’ve created, as they are one of only two countries with >10M people that made it into the Top 10. Can you guess the second one? Well, it wasn’t that difficult to spot Ukraine, right?

If you add to that one of the latest similar investigations by Willem de Groot (who many of you will know as the man behind MageReport), which shows that the Dutch are way ahead of everyone in the number of Magento stores per 1M, it clearly demonstrates the strength and impact Magento and its community have had on the businesses in this country, and vice versa.

Not sure whether Magento’s official orange color had something to do with the ease of its penetration to this market, but let’s leave that aside for someone else to analyze 🙂

What about Croatia?

The reason why I started this was, of course, a chance to do some humblebragging as well, since we have invested a lot of effort of our own into Magento and the local community of developers – and Croatia is in fact right there, in the Top 5, with just over 12 Magento certified people per 1M, chasing after our friends from the Netherlands. Let’s see how the tables turn in 2017. It would be really nice to see how our city of Osijek would rank on a similar chart if we compared cities globally, but that one is much more difficult to get right (proper data, cities vs metropolitan areas etc.).

More charts

For you who like to play around with numbers, here are some more charts to check out. Click on the thumbnails for full images.

A) USA (849) holds a slim lead over India (804) in the total number of certified individuals, and these two together take over 30% share of the global total of all Magento certified people.
[Image: magento-certified-per-country]

B) European countries hold almost half of all certificates globally, while having just 10% of the global population.

C) Magento certified per 1M – a list of Top 25 countries
[Image: magento-certified-per-1m-top25]

So, what now?

If you’re Magento certified yourself, jump on to the directory to check if you have a city/country listed next to your name to see if you made these numbers and helped your country out 🙂 If there’s something missing, there’s always the FAQ to help you sort it out.

And let’s all keep an eye out on these numbers and use them to further improve on the co-opetition within Magento ecosystem.

With these metrics, chances are we won’t have that much influence on the total population numbers, but we can all pull our weight when it comes to the certifications – we’re already preparing some of the Inchooers for new exams to get closer to and, hopefully, surpass our Dutch friends soon 🙂

Keep them coming!

The post How many Magento certified people do you know? appeared first on Inchoo.

Custom data components in Pimcore

Today I will show you how to create a custom data component in Pimcore 4.4. If you are using Pimcore, sooner or later you will end up with a requirement for a new form element (complex type). Data components in Pimcore are object types used for complex data modeling on objects. A data component consists of a data script, a tag ExtJS script and one Model which is responsible for saving/loading its configuration. Currently Pimcore has 19 complex data components (you can read more on https://www.pimcore.org/docs/latest/Objects/Object_Classes/Data_Types/index.html), and they will cover most requirements you will ever have. However, if (when) you need a custom data component, here’s how you can create it.

Custom component

For demonstration, we’ll create a custom data component which pulls its data from a custom controller and shows it as a select field (dropdown). We’ll also cover how to add custom options in the data script and use them in the tag script. First, we’ll create a plugin skeleton with basic plugin data in the plugins folder; our folder will be called “Customdatacomponent”. If you want to know how to create a plugin, please read the article “Extending Pimcore with plugins” on our blog. After we have created the plugin skeleton, it’s time to work on the component configuration script.

Data script

The data script is required for component configuration; without this script our custom component will not be displayed in Pimcore. Let’s create the script at [Pimcore root]/plugins/Customdatacomponent/static/js/data-components/data/customcomponent.js and include it in the plugin configuration (plugin.xml):

<pluginJsPaths>
	<path>/plugins/Customdatacomponent/static/js/data-components/data/customcomponent.js</path>
</pluginJsPaths>

After that we can write some code. First, we need to register the pimcore.object.classes.data.customdatacomponent namespace and add a new component class to pimcore.object.classes.data, created from pimcore.object.classes.data.data. To make it all work we also need to set the type, where it is allowed, the initialize logic, the type name, the icon class and the group. We also need to work on the getLayout method, where we’ll add one custom option. But first, here is the complete data script:

/**
 * Custom data component - configuration
 */
 
pimcore.registerNS("pimcore.object.classes.data.customdatacomponent");
pimcore.object.classes.data.customdatacomponent = Class.create(pimcore.object.classes.data.data, {
 
    type: "customdatacomponent",
 
    /**
     * define where this datatype is allowed
     */
    allowIn: {
        object: true,
        objectbrick: false,
        fieldcollection: false,
        localizedfield: false,
        classificationstore : false,
        block: true
    },
 
    initialize: function (treeNode, initData) {
        this.type = "customdatacomponent";
 
        this.initData(initData);
 
        this.treeNode = treeNode;
    },
 
    getTypeName: function () {
        return 'Custom data component';
    },
 
    getIconClass: function () {
        return "pimcore_icon_select";
    },
 
    getGroup: function () {
        return "select";
    },
 
    getLayout: function ($super) {
        $super();
 
        this.specificPanel.removeAll();
        this.specificPanel.add([
            {
                xtype: "textfield",
                fieldLabel: 'Custom option',
                name: "customoption",
                value: this.datax.customoption
            }
        ]);
 
        return this.layout;
    },
 
    applyData: function ($super) {
        $super();
        delete this.datax.options;
    },
 
    applySpecialData: function(source) {
        if (source.datax) {
            if (!this.datax) {
                this.datax =  {};
            }
        }
    }
 
});

You can see we have added a Custom option (text field) in the getLayout method. We’ll use that value in the tag script as the label value. For demonstration we have set the data component to be available only on objects and blocks (the allowIn configuration). This is how it should look in the data components menu:

[Image: customdatacomponent-1]

and this is how it should look on configuration form:

[Image: customdatacomponent-2]

Model

To save and load the component configuration we need to create a corresponding model. When you click the save button on the class form, Pimcore will look for a model for our custom data component in the Pimcore\Model\Object\ClassDefinition\Data namespace – a file named Customdatacomponent with a Customdatacomponent class which extends Model\Object\ClassDefinition\Data\Select. Now we need to create the model at the path [Pimcore root]/plugins/Customdatacomponent/lib/Pimcore/Model/Object/ClassDefinition/Data/Customdatacomponent.php. As you can see, the starting point for the autoloader in our plugin is the lib folder. Here is the model logic:

<?php
namespace Pimcore\Model\Object\ClassDefinition\Data;
 
use Pimcore\Model;
 
class Customdatacomponent extends Model\Object\ClassDefinition\Data\Select
{
 
    /**
     * Static type of this element
     *
     * @var string
     */
    public $fieldtype = "customdatacomponent";
 
    public $customoption;
 
    /** Restrict selection to comma-separated list of countries.
     * @var null
     */
    public $restrictTo = null;
 
    public function __construct()
    {
    }
 
    public function setcustomoption($customoption)
    {
        $this->customoption = $customoption;
    }
 
    public function getcustomoption()
    {
        return $this->customoption;
    }
}

We have extended the Model\Object\ClassDefinition\Data\Select class because our custom data component is also a select type and we already have all the logic there. For the custom option we create a class variable “customoption”; the name matches the option added in the data script. We also need setter and getter methods (setcustomoption and getcustomoption). Now we can test saving and loading the configuration for our component on the class form.

Tag script

Next, we’ll create the tag script, which is used to render the data component. First we’ll write the whole script and then I will explain every part. For this example we’ll cover the getLayoutEdit method, which is used by the edit form. Create the tag script at [Pimcore root]/plugins/Customdatacomponent/static/js/data-components/tags/customcomponent.js:

/**
 *
 */
pimcore.registerNS("pimcore.object.tags.customdatacomponent");
pimcore.object.tags.customdatacomponent = Class.create(pimcore.object.tags.select, {
 
    type: "customdatacomponent",
 
    initialize: function (data, fieldConfig) {
        this.data = data;
        this.fieldConfig = fieldConfig;
    },
 
    getLayoutEdit: function ($super) {
        $super();
 
        this.storeoptions = {
            autoDestroy: true,
            proxy: {
                type: 'ajax',
                url: "/plugin/Customdatacomponent/index/values",
                reader: {
                    type: 'json',
                    rootProperty: 'id'
                }
            },
            fields: ["id", "value", "text"],
            listeners: {
                load: function() {
                }.bind(this)
            }
        };
 
        this.store = new Ext.data.Store(this.storeoptions);
 
        var options = {
            name: this.fieldConfig.title,
            triggerAction: "all",
            editable: true,
            typeAhead: true,
            forceSelection: true,
            selectOnFocus: true,
            fieldLabel: this.fieldConfig.customoption,
            store: this.store,
            componentCls: "object_field",
            width: 250,
            style: "margin: 10px",
            labelWidth: 100,
            valueField: "value"
        };
 
        if (this.fieldConfig.width) {
            options.width = this.fieldConfig.width;
        } else {
            options.width = 300;
        }
 
        options.width += options.labelWidth;
 
        if (this.fieldConfig.height) {
            options.height = this.fieldConfig.height;
        }
 
        if (typeof this.data == "string" || typeof this.data == "number") {
            options.value = this.data;
        }
 
        this.component = new Ext.form.ComboBox(options);
 
        this.store.load();
 
        return this.component;
    }
});

After we set the type and initialize the class, we create the getLayoutEdit logic. First, we create storeoptions, which are used to create the store that holds the data pulled from the custom controller; that data will be formatted to populate the dropdown options. In the options we point the proxy to the URL “/plugin/Customdatacomponent/index/values”, enable autoDestroy and define the fields returned by the controller per item. After we create the store options, we instantiate the Ext.data.Store class. The next thing is to create options for the component. Most of them are generic, but it’s important to set the name (from the configuration object), the field label and the store (the instantiated one). Notice that we have set fieldLabel to the value of our customoption from the configuration, to demonstrate how to use a custom option field from the data script in the tag script. After the options, we create the Ext.form.ComboBox, load the store and return the instantiated component. This is how our component looks in the object form:

[Image: customdatacomponent-3]

Controller

The only thing still missing to provide data for the tag script is the custom controller. We’ll create it at [Pimcore root]/plugins/Customdatacomponent/controllers/IndexController.php:

<?php
 
class Customdatacomponent_IndexController extends \Pimcore\Controller\Action\Admin
{
    public function indexAction()
    {
 
        // reachable via http://your.domain/plugin/Customdatacomponent/index/index
    }
 
    public function valuesAction()
    {
 
        $response = array(
            array(
                'id' => 0,
                'value' => 0,
                'text' => 'Value 0'
            ),
            array(
                'id' => 1,
                'value' => 1,
                'text' => 'Value 1'
            ),
            array(
                'id' => 2,
                'value' => 2,
                'text' => 'Value 2'
            )
        );
 
        return $this->_helper->json($response);
    }
}

Our logic is in the valuesAction method. As you can see, we have prepared a sample array with the fields defined in the store configuration in the tag script. The important thing is to return the array as JSON; for that we use the json method from our helper. Note that we are extending the \Pimcore\Controller\Action\Admin class because the data may be sensitive, so only an admin should be able to access it.

Conclusion and Github

Now you have basic knowledge of how to extend data components in Pimcore, and you can additionally improve our plugin to cover, for example, grid logic for the select field. Hope this helps you understand data components and Pimcore itself. Thanks for reading – soon we’ll cover more topics on extending Pimcore.

You can find our example plugin on https://github.com/zoransalamun/pimcore-custom-data-component

However, if you’re having trouble with Magento or Pimcore or need our technical assistance, feel free to drop us a note – we’ll be glad to check out your code and help you with development or review your site and help you get rid of any bugs there.

The post Custom data components in Pimcore appeared first on Inchoo.

Programmatically create upsell, cross sell and related products in Magento

This article will explain how to add upsell, cross sell and related products to Magento programmatically. One practical example would be data migration from another eCommerce system to Magento. You can read a nice article on how to add upsell, cross sell and related products from the administration here. It explains what all these product relations mean and where they are used on the site.

Load existing product data

At the beginning, there is a product that needs to be updated with product relations. It needs to be loaded as usual.

$product = Mage::getModel('catalog/product')->load($productId);

This loaded product model will not contain information about already existing upsell, cross sell and related products. If the loaded product doesn’t have any previous upsell, cross sell or related products set, it can be saved immediately with the new data. But if there is already existing data about these products, it must be loaded first, merged with the new data and then saved – otherwise, it would be overwritten with the new data only. There are specific functions for that.

$upSellProducts = $product->getUpSellProducts();
$crossSellProducts = $product->getCrossSellProducts();
$relatedProducts = $product->getRelatedProducts();

These functions load all upsell, cross sell and related product models as an array with numeric keys starting from zero.

 

Prepare existing product data

In order to update the product’s upsell, cross sell and related information, the loaded relations need to be rearranged into an array with product IDs as keys. This array should also contain information about the product position as a subarray. The position parameter determines the product’s order on the frontend, usually in a sidebar or slider. This parameter can also be set through the Magento administration by opening the product’s upsell, cross sell or related tab.

foreach ($upSellProducts as $upSellProduct) {
    $upSellProductsArranged[$upSellProduct->getId()] = array('position' => $upSellProduct->getPosition());
}
 
foreach ($crossSellProducts as $crossSellProduct) {
    $crossSellProductsArranged[$crossSellProduct->getId()] = array('position' => $crossSellProduct->getPosition());
}
 
foreach ($relatedProducts as $relatedProduct) {
    $relatedProductsArranged[$relatedProduct->getId()] = array('position' => $relatedProduct->getPosition());
}

Merge new product data

When migrating products to Magento, products will be created, not updated. If there are multiple new upsell, cross sell or related products, the position parameter can be incremented in a loop starting from zero, or it can be left empty for all of them (as below). Merging the new upsell, cross sell and related products:

$newUpSellProducts = array($newUpSellProduct1, $newUpSellProduct2);
foreach ($newUpSellProducts as $newUpSellProduct) {
    $upSellProductsArranged[$newUpSellProduct->getId()] = array('position' => '');
}
 
$newCrossSellProducts = array($newCrossSellProduct1, $newCrossSellProduct2);
foreach ($newCrossSellProducts as $newCrossSellProduct) {
    $crossSellProductsArranged[$newCrossSellProduct->getId()] = array('position' => '');
}
 
$newRelatedProducts = array($newRelatedProduct1, $newRelatedProduct2);
foreach ($newRelatedProducts as $newRelatedProduct) {
    $relatedProductsArranged[$newRelatedProduct->getId()] = array('position' => '');
}

When all relations are merged, they should be set as the product’s link data parameters:

$product->setUpSellLinkData($upSellProductsArranged);
$product->setCrossSellLinkData($crossSellProductsArranged);
$product->setRelatedLinkData($relatedProductsArranged);

Finally the product can be saved:

$product->save();

Database structure

This may seem like a simple thing: all that is needed is to set the upsell, cross sell and related product IDs in an array with their positions and save the product. The backend process is actually more complicated. The Magento function that handles the product relations saving process is saveProductRelations($product). It is located in the Mage_Catalog_Model_Product_Link class.

The database structure for product relations is an EAV-like structure. The main table, in which most of this information is saved, is “catalog_product_link”. Its structure is very simple – it consists of 4 columns: “link_id” is the increment ID, “product_id” is the edited product, “linked_product_id” is the ID of the product that is related to the edited product, and “link_type_id” is the relation type ID (4 is for upsell, 5 for cross sell and 1 for related products). A second table worth mentioning is “catalog_product_link_attribute_int”, which stores the product’s position parameter mentioned earlier.

 

select * from catalog_product_link;

+---------+------------+-------------------+--------------+
| link_id | product_id | linked_product_id | link_type_id |
+---------+------------+-------------------+--------------+
|     1   |     247    |               640 |            1 |
|     2   |     247    |               642 |            1 |
|     3   |     247    |               647 |            1 |
|     4   |     247    |               641 |            1 |
|     5   |     247    |               652 |            4 |
|     6   |     247    |               651 |            4 |
|     7   |     247    |               651 |            5 |
|     8   |     247    |               652 |            5 |
|     9   |     247    |               652 |            1 |
|    10   |     247    |               651 |            1 |
+---------+------------+-------------------+--------------+
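
The position values for these links are stored in "catalog_product_link_attribute_int" and joined back via link_id. A query along these lines (column names as found in a default Magento 1 schema) shows the positions alongside the links:

```sql
SELECT l.product_id,
       l.linked_product_id,
       l.link_type_id,
       i.value AS position
FROM catalog_product_link AS l
LEFT JOIN catalog_product_link_attribute_int AS i
       ON i.link_id = l.link_id
WHERE l.product_id = 247;
```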

The post Programmatically create upsell, cross sell and related products in Magento appeared first on Inchoo.

Case Study: Migration from Magento 1 to Magento 2 for Sloan Express

Sloan Express is a family-owned business with deep roots in the agricultural industry that has been serving the needs of farmers worldwide for over 80 years. Located in Central Illinois, Sloan Express is the area leader offering new agricultural parts that are equal to or better than the original equipment part. They sell directly to farmers, implement dealers and repair shops.

Sloan Express has been able to address some of the problems that today's farmers face: parts not stocked locally; a shortage of local sources for parts; and most important – TIME. Sloan Express customers get prompt and direct delivery, with no time wasted trying to find a part and then waiting for it to come in.

What were the challenges for us?

[Image: Sloanex_new, 2017]
[Image: Sloanex_old, 2016]

Recognizing the need to take the business to the next level, Jeff Sloan and the team from Sloan Express approached us looking for Magento professionals who could migrate their existing store from Magento 1.7 Open Source edition to Magento 2.

Magento 2 introduces new methodologies and technologies for delivering an enhanced shopping and store experience to merchants and users. But, to be honest, migrating from Magento 1 to Magento 2 is not an easy and trouble-free process. Since it's not automated, there is plenty of manual work that needs to be done by professionals who understand the migration process and your business in order to get a stable and fully functional store.

Sloan Express knew what they wanted for their future. They needed a solution that can easily scale up when required and has a modular architecture to ensure faster page load times, faster add-to-cart server response times and faster end-to-end checkout times.

[Image: Inchoo at Sloan Express]

Magento 2 Open Source (previously known as Community Edition) comes with support for the MySQL search engine only, but some projects require a better or more adjustable search engine in order to increase sales or conversion rate. For Sloan Express, we implemented the SOLR search engine to achieve blazing-fast search results that are also highly reliable and fault tolerant. With near real-time indexing, advanced full-text search capabilities and optimisation for high-volume traffic, SOLR has brought a new dimension for the customers using the site.

The client wanted to keep all of their existing features and extensions from Magento 1. As one can imagine, Magento 1 extensions are not compatible with Magento 2, so we implemented new ones and set the fundamentals for all future technical implementations and integrations such as PIM, ERP and other complex technical systems.

Since Sloan Express had a large number of categories, we had to do a major restructuring of the store's hierarchy, which, among other things, resulted in structured navigation and a flow that feels more natural to the end user.

The end result

Sloan Express now has a modern and clean responsive design with a completely new look and flow that provides an optimal viewing and interaction experience across a wide range of devices. We designed it with the business needs of this special niche and eCommerce trends in mind, supported by the gathered information on the behavior of visitors on the previous store.

[Image: Results for Sloan Express]

By implementing several analytics tools, we now have a better understanding of the customers' journey and how they engage with the brand. That gives us the opportunity to continuously test and improve technical functionality and user experience in order to increase revenue and reflect the quality that stands behind the name of Sloan Express in the agriculture world.

The post Case Study: Migration from Magento 1 to Magento 2 for Sloan Express appeared first on Inchoo.

How to keep design library in sync across the team? Welcome Sketch Libraries!

The buzz these days is all about design systems, but a design system by itself is not enough to ensure consistency across all designs. When working with design systems, the main challenges are ongoing maintenance and informing everyone about changes.

For a long time, there wasn't a thorough solution for designers who design in Sketch that would provide easy access to the latest styles and propagate changed assets to team members. Yes, we have had the ability to share symbols via plugins for a while (Craft's Library), but there were too many problems, and sharing a library is too important to rely on a third-party plugin.

Welcome Sketch Libraries

Sketch has just released version 47, and we finally have a document with symbols that can be used across other documents, so let's see how to use Libraries in Sketch.

1. Create a Sketch document with at least one symbol and save your document in Dropbox, Box, Sync or any other place where your colleagues have access.


2. Press CMD + comma to open Sketch’s Preferences and navigate to the Libraries tab.


3. You’ll notice there is an iOS UI Design library included, but we’ll create a new one. Click the “Add Library…” button and choose your document. Congrats, you’ve just created a single source of truth for everyone on your team.


Team members can now easily add the Library by following the same steps mentioned above and access the symbols in that file from any Sketch file.

Inserting, editing and accepting changes

Using a shared library is simple and straightforward. Inserting its symbols works just like inserting regular symbols; the only difference is that they are not stored in your document. To insert a symbol, just find your shared library at the bottom of the list in the Insert menu.


You’ll notice external symbols have slightly different icons from the local symbols to avoid confusion.

Once inserted, there are two options for editing an external symbol: you can unlink it from the Library or open it in the original document.


If you choose “Unlink from Library”, it will detach from the external library and become a local symbol in your current Sketch file.

Making changes in the original document will affect all instances of the symbol across any document using this library, but only if those changes are accepted. After changes are made, everyone using the library will see a “Library Update Available” badge in the top-right corner of Sketch.


Maybe the Sketch crew should make that badge more prominent, because it’s easy to miss. Anyhow, clicking it will display a dialog box with the outdated symbols and an option to selectively update them.


To sum up…

This feature is definitely a game changer for all Sketch users, and it will permanently change real-time collaboration. What we would like to see in a future update is an option to include text styles and layer styles in a library.

The post How to keep design library in sync across the team? Welcome Sketch Libraries! appeared first on Inchoo.

Solr and Magento – search by department

There are eCommerce stores which sell a wide variety of products like food, personal care, electronics, and so on. On those stores, visitors want to be able to search within a specific category. This can be achieved by adding a new feature: search by department or category. In this quick tutorial, I will explain the basic concept of how to do it, using the Solr search engine as an example.

Assuming you already use a Solr search server on your eCommerce site, the first step should be checking whether there is a category_ids field in the Solr index. The category_ids field must exist in the index in order to filter by category ID.

To check if the field exists, call this URL: http://localhost:8983/solr/collection1/select?q=*%3A*&wt=json&indent=true.

You can see category_ids in the response below:
[Screenshot: Solr JSON response with the category_ids field]

The category_ids field should be declared in Solr’s schema.xml as in the following example:

        <field name="category_ids"  type="int"  indexed="true" multiValued="true"/>

The category_ids field is indexed and multivalued: it can contain multiple values, since a product can be assigned to multiple categories.

If the previous conditions are fulfilled, only the Magento side needs to be modified to send a proper category filter along with the search phrase. If you use the Solarium client, it is pretty easy.

$client = new Solarium\Client($config);
 
// get a select query instance
$query = $client->createSelect();
$query->setQuery('Teflon');
$query->setFields(array('id','name','price'));
 
// create a filterquery by category id 3887
$fq = $query->createFilterQuery('category_ids')->setQuery('category_ids:3887');
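
To actually execute the query and read the results, the Solarium select call can be used. This continues the snippet above (it assumes $client and $query from that snippet, plus a running Solr server; the document fields depend on your schema):

```php
<?php
// execute the select query built above
$resultset = $client->select($query);

echo 'Products found: ' . $resultset->getNumFound() . PHP_EOL;
foreach ($resultset as $document) {
    // id, name and price were requested via setFields() above
    echo $document->id . ' | ' . $document->name . ' | ' . $document->price . PHP_EOL;
}
```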

If you are interested in finding out more about the Solarium concepts, click here.

Finally, the query sent to Solr should look like this:
http://localhost:8983/solr/collection1/magento_en?q=Teflon&fq=category_ids:3887

The q parameter is the search term “Teflon”,
and the fq parameter filters by category ID 3887 (more about Solr common parameters)

The results

This is how “search by department” can look on the Magento frontend:
[Screenshot: search by department on the Magento frontend]

With this feature, you can improve site search performance and decrease the need for search refinement, which ultimately has a direct impact on your eCommerce conversion rate.

The post Solr and Magento – search by department appeared first on Inchoo.


Add custom image field for custom options

We had a request from a client who wanted to display images for custom options. In this article, I’ll explain how to add an image field to custom options in the admin.

Create new module Inchoo_ProductCustomOptionsFile

app/etc/modules/Inchoo_ProductCustomOptionsFile.xml

<?xml version="1.0"?>
<config>
    <modules>
        <Inchoo_ProductCustomOptionsFile>
            <active>true</active>
            <codePool>local</codePool>
        </Inchoo_ProductCustomOptionsFile>
    </modules>
</config>

Configuration file

To add your custom field, it is necessary to rewrite the class Mage_Adminhtml_Block_Catalog_Product_Edit_Tab_Options_Type_Select and set your template. To fill the custom option image field with new image names, it is necessary to rewrite the class Mage_Adminhtml_Block_Catalog_Product_Edit_Tab_Options_Option.

/app/code/local/Inchoo/ProductCustomOptionsFile/etc/config.xml

<?xml version="1.0"?>
<config>
	<modules>
		<Inchoo_ProductCustomOptionsFile>
			<version>1.0.0</version>
		</Inchoo_ProductCustomOptionsFile>
	</modules>
	<global>
		<blocks>
			<adminhtml>
				<rewrite>
					<catalog_product_edit_tab_options_type_select>Inchoo_ProductCustomOptionsFile_Block_Adminhtml_Rewrite_Catalog_Product_Edit_Type_Select</catalog_product_edit_tab_options_type_select>
					<catalog_product_edit_tab_options_option>Inchoo_ProductCustomOptionsFile_Block_Adminhtml_Rewrite_Catalog_Product_Edit_Tab_Options_Option</catalog_product_edit_tab_options_option>
				</rewrite>
			</adminhtml>
			<inchoo_file>
				<class>Inchoo_ProductCustomOptionsFile_Block</class>
			</inchoo_file>
		</blocks>
		<resources>
			<inchoo_productcustomoptionsfile_setup>
				<setup>
					<module>Inchoo_ProductCustomOptionsFile</module>
				</setup>
			</inchoo_productcustomoptionsfile_setup>
		</resources>
	</global>
	<admin>
		<routers>
			<adminhtml>
				<args>
					<modules>
						<inchoo_product before="Mage_Adminhtml">Inchoo_ProductCustomOptionsFile_Adminhtml</inchoo_product>
					</modules>
				</args>
			</adminhtml>
		</routers>
	</admin>
</config>

Blocks

/app/code/local/Inchoo/ProductCustomOptionsFile/Block/Adminhtml/Rewrite/Catalog/Product/Edit/Type/Select.php

We want to use our own template instead of the default one for the custom option.

<?php
class Inchoo_ProductCustomOptionsFile_Block_Adminhtml_Rewrite_Catalog_Product_Edit_Type_Select extends Mage_Adminhtml_Block_Catalog_Product_Edit_Tab_Options_Type_Select
{
	public function __construct()
	{
		parent::__construct();
		$this->setTemplate('catalog/product/edit/options/type/select-with-file.phtml');
		$this->setCanEditPrice(true);
		$this->setCanReadPrice(true);
	}
 
}

Create a template file by copying the contents of /app/design/adminhtml/default/default/template/catalog/product/edit/options/type/select.phtml into our new template file /app/design/adminhtml/default/default/template/catalog/product/edit/options/type/select-with-file.phtml.

Then add the code between the // Inchoo comments into the select-with-file.phtml template:

<!---...-->
 
OptionTemplateSelect =
<!---...-->
        '<th class="type-sku"><?php echo Mage::helper('core')->jsQuoteEscape(Mage::helper('catalog')->__('SKU')) ?></th>'+
            // Inchoo
        '<th class="type-title"><?php echo Mage::helper('core')->jsQuoteEscape(Mage::helper('catalog')->__('Image Name')) ?></th>'+
        '<th class="type-title"><?php echo Mage::helper('core')->jsQuoteEscape(Mage::helper('catalog')->__('Upload New Image')) ?></th>'+
            // Inchoo
        '<th class="type-title"><?php echo Mage::helper('core')->jsQuoteEscape(Mage::helper('catalog')->__('Sort Order')) ?></th>'+
<!---...-->
 
OptionTemplateSelectRow =
<!---...-->
        '<td><input type="text" class="input-text" name="product[options][{{id}}][values][{{select_id}}][sku]" value="{{sku}}"></td>'+
            // Inchoo
        '<td><input type="text" class="select-type-image" id="product_option_{{id}}_select_{{select_id}}_image" name="product[options][{{id}}][values][{{select_id}}][image]" value="{{image}}">{{checkboxScopeTitle}}</td>'+
        '<td><input type="file" class="input-text select-type-image" id="image" name="{{id}}-{{select_id}}"></td>'+
            // Inchoo
        '<td><input type="text" class="validate-zero-or-greater input-text" name="product[options][{{id}}][values][{{select_id}}][sort_order]" value="{{sort_order}}"></td>'+
<!---...-->

To pass our new custom option value to the template, we need to rewrite the Mage_Adminhtml_Block_Catalog_Product_Edit_Tab_Options_Option class. Create the block /app/code/local/Inchoo/ProductCustomOptionsFile/Block/Adminhtml/Rewrite/Catalog/Product/Edit/Tab/Options/Option.php and add the code between the // Inchoo comments. The getOptionValues() function returns all product custom options to the JavaScript object that fills the custom option fields.

<?php
class Inchoo_ProductCustomOptionsFile_Block_Adminhtml_Rewrite_Catalog_Product_Edit_Tab_Options_Option extends Mage_Adminhtml_Block_Catalog_Product_Edit_Tab_Options_Option
{
	public function getOptionValues()
	{
 
// ...
	$i = 0;
	$itemCount = 0;
	foreach ($option->getValues() as $_value) {
		/* @var $_value Mage_Catalog_Model_Product_Option_Value */
		$value['optionValues'][$i] = array(
			'item_count' => max($itemCount, $_value->getOptionTypeId()),
			'option_id' => $_value->getOptionId(),
			'option_type_id' => $_value->getOptionTypeId(),
			'title' => $this->escapeHtml($_value->getTitle()),
			'price' => ($showPrice)
				? $this->getPriceValue($_value->getPrice(), $_value->getPriceType()) : '',
			'price_type' => ($showPrice) ? $_value->getPriceType() : 0,
			'sku' => $this->escapeHtml($_value->getSku()),
			'sort_order' => $_value->getSortOrder(),
			// Inchoo
			'image' => $_value->getImage(),
			// Inchoo
		);
// ...

Now we have our custom fields, Image Name and Upload New Image:

[Screenshot: Image Name and Upload New Image fields in admin]

Saving images

To save image names into the database, we need to create a setup script and rewrite Mage_Adminhtml_Catalog_ProductController. First, create the setup script /app/code/local/Inchoo/ProductCustomOptionsFile/sql/inchoo_productcustomoptionsfile_setup/install-1.0.0.php

<?php
/* @var $installer Mage_Core_Model_Resource_Setup */
 
$installer = $this;
 
$installer->startSetup();
$installer->getConnection()
    ->addColumn($installer->getTable('catalog/product_option_type_value'), 'image', 'VARCHAR(255) NULL');
$installer->endSetup();
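
The addColumn() call above should be equivalent to an ALTER statement along these lines (assuming the default table name without a prefix):

```sql
ALTER TABLE catalog_product_option_type_value
    ADD COLUMN image VARCHAR(255) NULL;
```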

Rewrite the class Mage_Adminhtml_Catalog_ProductController and add the code between the // Inchoo comments to save image names into the catalog_product_option_type_value table and upload the images.

 

<?php
require_once(Mage::getModuleDir('controllers','Mage_Adminhtml').DS.'Catalog'.DS.'ProductController.php');
 
class Inchoo_ProductCustomOptionsFile_Adminhtml_Catalog_ProductController extends Mage_Adminhtml_Catalog_ProductController
{
	/**
	 * Initialize product before saving
	 */
	protected function _initProductSave()
	{
 
// ..
 
		$product->setCanSaveConfigurableAttributes(
			(bool) $this->getRequest()->getPost('affect_configurable_product_attributes')
			&& !$product->getConfigurableReadonly()
		);
// Inchoo
		// $productData is set earlier in _initProductSave() from $this->getRequest()->getPost('product')
		$skuImageName = trim($product->getSku());
		$imagesFiles = $_FILES;
		$path = Mage::getBaseDir('media') . DS . 'catalog' . DS . 'customoption' . DS . 'images';
 
		foreach ($imagesFiles as $key=>$value)
		{
			$optionsValue = explode('-',$key);
			foreach ($value as $key2=>$value2)
			{
				if($key2=='name' && $value2!="") {
					try {
						$uploader = new Varien_File_Uploader($key);
						$uploader->setAllowedExtensions(array('jpg','jpeg','gif','png','svg'));
						$uploader->setAllowRenameFiles(false);
						$uploader->setFilesDispersion(false);
						$optionTitle = trim($productData['options'][$optionsValue[0]]['title']);
						$optionValueTitle = trim($productData['options'][$optionsValue[0]]['values'][$optionsValue[1]]['title']);
						$imageExtension = pathinfo($value2,PATHINFO_EXTENSION);
						$newImageName =$skuImageName.'_'.$optionTitle.'_'.$optionValueTitle.'.'.$imageExtension;
						$uploader->save($path, $newImageName);
						$productData['options'][$optionsValue[0]]['values'][$optionsValue[1]]['image']=$uploader->getUploadedFileName();
					} catch(Exception $e) {
						Mage::log('Unable to save custom option image. ' . $e->getMessage(), null, null, true);
					}
 
				}
			}
		}
// Inchoo

And that’s it. If you have any questions, feel free to post them in the comments.

The post Add custom image field for custom options appeared first on Inchoo.

Keeping an eye on small things in eCommerce projects

When managing projects, one usually focuses on the big things: the biggest costs, the biggest features, the biggest risks, etc. The same goes for building an eCommerce site – the biggest, most important things are, well, most important.

But large topics are not the be-all and end-all of the project.

In this post, we will illustrate how tiny issues can have an outsized influence on the project. Through analysis of several examples from our experience, we will try to understand how small perturbations shape the course of the project.

So let’s start with something totally relevant, let’s start with – Napoleon.


We all know the story. He conquered Europe, and then turned his sights on Russia. His campaign started with an army of 680 000 men, and in a short time he marched into Moscow. The Russians evacuated Moscow and burned down three quarters of it! Napoleon, short on supplies, retreated from the empty city. During the retreat, he lost most of his army: out of 600 000 men, only 30 000 survived, and only 1 000 were fit for service.

A great story of brilliant defensive strategy.

Or is it?

In 2001, workers digging trenches for telephone cables in Vilnius, Lithuania, found over 2 000 skeletons, stacked three on top of each other and arranged in a V-shape. Analysis revealed that these were soldiers from Napoleon’s time, and with their remains, the complete story of Napoleon’s defeat was revealed:

Napoleon starts his attack on Russia in June 1812 from Germany. Things go downhill in Poland. The summer is unusually hot, and 20 thousand horses die of thirst. This stretches and strains the supply lines, and resources are scarce. Hygiene is bad and lice become a huge problem – they are visibly crawling on the men, and the whole army is infested. Lice carry typhus, and in a month 80 000 soldiers die or are incapacitated by the disease.

In August, Napoleon conquers Smolensk, but another 105 000 men are lost to typhus. In the next two weeks, typhus claims another 60 000 men. In September, a week after the battle of Borodino, Napoleon enters Moscow with 90 000 men. The city is deserted and burned. Napoleon requests reinforcements, but of the 15 000 men sent to him, 10 000 die from typhus. Winter is coming, and supplies have run out. Napoleon decides to retreat.

In December they reach Vilnius, with only 20 000 men fit for service. Fearing a coup, Napoleon urgently leaves for Paris, and General Murat organizes the retreat.

And now it is clear: the military genius of Kutuzov, the coldest of Russian winters, and all the Russian cannons heard pounding in Tchaikovsky’s 1812 Overture weren’t enough to defeat the great general. It took the smallest creature of them all, the unassuming, ordinary louse, to bring destruction to the Grande Armée.

With that in mind, what are the small, seemingly insignificant things that have an outsized influence on our projects?

Let’s walk through a few examples of small challenges or hiccups that proved to be big in the end (whether we acted proactively to resolve them and prevent the excrement from hitting the fan, or learned the hard way by addressing them too late in the game).

We will start with technical topics, and move slowly into organizational and pure project management issues.


Comment? What comment?
– When comments in code are bad

You know how developers often comment out pieces of code while working, in order to speed up a process or work around specific bugs in Magento (what bugs in Magento, you ask?!).

Well, it happened one time too many that a piece of code commented out during development made it to production that way, causing havoc on the live site. Why is this commented out? What does it do? What did it use to do?

Lesson learned: Establish proper git branching model – And stick to it for dear life!

Leave commented-out code out of committed code. Instead, keep it in your uncommitted working copy, where git diff will show it – this may be subject to interpretation and depends on the project at hand, but it is a good rule of thumb. For controlling different versions of our code, we should use – version control software.

Attack of the Robots
– Ignore crawlers at your own peril!

A Magento site works the same with or without the robots.txt file; the file itself does not affect the functioning of the store. The interesting thing about it is that it is one of the few items that must differ between the staging and live site (on staging we want to exclude everything from crawling, while on the live site we are targeting specific files and folders – though usually a large number of them).
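
For illustration, a staging robots.txt that shuts crawlers out entirely is only two lines:

```
User-agent: *
Disallow: /
```

while the live file allows crawling but excludes specific paths (the paths below are just typical Magento examples, not a complete list):

```
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Disallow: /app/
Disallow: /var/
```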

What happens is that after a deployment, because of .gitignore, a developer inadvertently deletes the robots.txt file without noticing. Actually, nobody notices – the site works fine and everybody is celebrating the new feature release.

Nobody notices until we get a call from the client that the site is down. The next call is with the hosting provider – they yell about being bombarded with requests and about Google crashing our site. All because of one simple, small txt file that does not affect our code or the user.

Lesson learned: Establish post-deploy check procedure – And execute it every time!

There are a lot of small tasks that need to be done and are easily forgotten during a deploy. Having a checklist that we can rely on frees our mental energy to focus on executing the task at hand.


I-Track, U-Track, DDoS-Track
– Newsletters gone wild!

Newsletters are a great way to inform customers about new promotions and discounts. One large client had a big subscriber database and had been sending newsletters to them without problems for years. The URL in the newsletter led to a hefty, complex query, but as Magento has full-page caching, it was not a problem for the server.

At one point, marketing came up with a very reasonable idea: track which subscribers open the newsletter, so they could analyze and segment the campaigns. The tracking was implemented by adding a unique ID parameter to each newsletter URL, so when a user opens the newsletter, the hit is registered.

What that meant was that each user was served a unique page, Magento’s full-page cache was no longer used, and the query was executed for each hit. With a huge number of subscribers opening their newsletter at the same time, and thus executing the enormous query, the site went down. The client had DDoS-ed themselves.

Lesson learned: Establish a Change Request Procedure – Make it simple, clear and stick to it!

Change is a fact of life for any project, which is why we have to be prepared for it. By establishing a Change Request Procedure, we make sure that the relevant people vet the change. For example, should the request be checked by Development, an SEO specialist, Marketing? In our example, the solution was to set up caching to ignore the parameter – something a developer would notice in a second.
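
For illustration, “set up caching to ignore the parameter” can be a few lines at the caching layer. A hedged Varnish VCL sketch (the nl_id parameter name is made up for this example):

```
sub vcl_recv {
    # strip the hypothetical nl_id tracking parameter so every newsletter
    # hit maps to the same cached page
    set req.url = regsuball(req.url, "(\?|&)nl_id=[^&]*", "\1");
    # tidy up a dangling "?&" or trailing "?"/"&" left by the removal
    set req.url = regsub(req.url, "\?&", "?");
    set req.url = regsub(req.url, "(\?|&)$", "");
}
```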

Extending the non-extendable
– Assumptions are just that!

We have a relatively large client (around 80 000 products) who wanted to track stock for a subset of products and, if a customer tries to order an out-of-stock product, offer them a subscription to a notification for when the product is back in stock.

The notification is easy to do if stock management is global, i.e. if Magento manages the stock for all products. If stock is tracked only for a subset of products, as was the case with this client, then Magento will trigger the subscription message for all non-managed products (since their stock is 0).

We planned to go with Magento 1, so the solution was simple – extend the core to additionally check whether the product has managed stock and, if not, suppress the message. 30 minutes of work.

Then Magento 2 became a thing, and the plan changed: we’ll go with Magento 2. Because why not. Now, Magento 2 does not allow extending the core in the same way – in general, you don’t extend a component, you write your own. In this case, we were deep in the core; we had to rework the whole product page, which landed us with 30 hours of work. Add a few similar customizations, and you have a project deep in the red.

Lesson learned: Check your assumptions – Then double check them!

Every project is built on a series of assumptions. If one of them changes or proves wrong (we’re using Magento 2 instead of Magento 1), then the project is in danger (in this case, our estimates were way off). But if we know the assumptions we’re building our project on (dare I say: if we have them written down), then we can always check them and react when they change.
Most important lesson: build a relationship of trust with the client, so you can work together when problems like these arise.

An admin, an admin! My kingdom for an admin!
– Who handles system administration?

We had a client where we took care of everything on the Magento side, and everyone thought (OK, we thought) that their hosting company took care of permissions and the overall server-side setup.

However, when we came close to deploying to the live site (there was a specific environment in place where we were allowed to push changes live in a very specific way), we brought the site down, only to realize that our latest deployments had failed to reach production due to a server-side setup mix-up. We were not in a position to fix this ourselves, the hosting company wasn’t aware we expected them to handle it, and the client didn’t know whose job it was supposed to be in the first place. A small omission in communication caused a huge issue when push came to shove.

Lesson learned: Know your scope – In more detail than you think you need

Ask questions, a lot of them. Prepare a checklist for the Sales team and on-boarding process. PMs should make sure to have all of this information in place before the team starts working.

Roses are red, violets are blue
– Clash of personalities

When a new project kicks off, you have to establish a good rapport with the client. Sometimes, when deadlines are tight and you are already stretched, you don’t have enough time to think about the personality match with the client. Things can go south quickly if there is a personality or communication style mismatch that you sweep under the rug, hoping it will sort itself out.

We had a scenario in which our lead developer, who acted as the main point of contact, and the client almost had a falling-out over a specific task whose feature request was poorly communicated. The confusion, combined with poor judgment on how one can and should communicate directly with a client, created enough bad blood to force us to remove the lead developer from all communication with the client and have another team member assume this role.

Fortunately, the timing was not completely off, and we managed to salvage the relationship and the project.

Lesson learned: Know your client – Adapt your communication to the client

Keep a close eye on the communication; invest time in learning who you are talking to on the other side: who the client’s representative is, what communication style they prefer, how technical they are… so that you can decide what communication tone and frequency would be best from your end. In PM-speak, do a stakeholder analysis and use it as input for creating a communication matrix for the project.

Six is not enough?

If these six examples did not convince you, we could spend hours talking about:

  • Client thinking something is irrelevant or easy and forgetting to tell us until it’s too late. E.g. automatic order processing is “just one button”, or the client doesn’t mention they have multiple stores,
  • The time we did not contract 3rd party support (for integrated systems),
  • That project where we did not include all stakeholders from the client side (e.g. not all departments) which led to last minute changes and budget overruns,
  • When we did not explain our process to the client, so they were late with their deliveries (e.g. logos, transactional emails),
  • The client that was too detailed and wanted to have spit-polished plans – the whole budget was spent only on planning,
  • Or many projects where we did not check if extensions really work with the latest Magento version (e.g. M1 -> M2 upgrade)
  • Or …

Instead of conclusion

“By failing to prepare, you are preparing to fail.”
– Benjamin Franklin

The six examples we covered range from technical to social, but they have one thing in common: the solution was not technical – it was better processes, communication and organization; in short, better project management. They are the best illustration that the biggest risks and the biggest opportunities in projects come from good or bad project management. Please share your experiences in the comments – what were the small things that made or broke your projects, and how did you deal with them?

Oh, I almost forgot…

Those skeletons in Vilnius? That wasn’t a mass grave – they were not burying the bodies. The ground was too frozen to dig, so they could not dig trenches. They used the frozen corpses of their comrades to build a breastwork, a shield, barricades to protect them from the advancing Russian army.

Don’t let that happen to you because you ignored small things.

The post Keeping an eye on small things in eCommerce projects appeared first on Inchoo.

Why you should go to MageTestFest if you are a Magento developer

When a conference is advertised as “Magento. Software Testing. Party.” and Yireo stands behind its organization, there is really not much left to say in order to convince you to attend MageTestFest that’s happening November 15th – 18th 2017!

There’s no doubt – you must clear your schedule and head off to this international, developer-oriented event that has one single focus: TESTING! Vital for clean code, testing is a topic that can always break the unbearable silence among developers who are just meeting each other.

But let’s be serious for a moment here. Testing should be a fundamental part of your work if you are a developer. The guys behind MageTestFest saw the need for a full-depth conference about testing, where everybody is talking about the same topic, which results in extreme focus, learning and fun. And it all makes for cleaner code, fewer screw-ups and happier customers.

Who are the speakers?

Only proven experts on the subject! Sebastian Bergmann, Mathias Verraes, Vinai Kopp, Fabian Schmengler, James Cowie, Tom Erskine, Igor Minailo and Jisse Reitsma will blow you away with the knowledge they are willing to share in order for you to become an even better developer!

 

Four exciting days are ahead of us all – 1 conference day, 2 workshops (PHPUnit, DDD) and 1 contribution day/hackathon.

Meet the Inchooers, they’ll have the goodies!

Inchoo troops will also be there! Look for our green banner where Stjepan Udovicic and Luka Rajcevic will greet you with open arms (that will be filled with Inchoo swag)! T-shirts, notebooks, stickers – be sure to grab yours while you can! 😉


Go to MageTestFest, invest in a future with fewer screw-ups!

The post Why you should go to MageTestFest if you are a Magento developer appeared first on Inchoo.

How to delete spam customer accounts from Magento

We all love spam bots, don’t we? They really help us improve our sites. There was a situation with one of our clients being hit by a spam bot that generated dozens of customer accounts daily. Let us show you how to get rid of them and protect the site against future attacks.

How the problem started

One would ask: “Why wasn’t there any validation on the register form?”. Well, things were functioning smoothly for a couple of years, so there was no need for one. The attack went on for a few weeks before it was recognized. So, we found ourselves in a situation where spam customer accounts had to be recognized, deleted and prevented from registering again.

Detailed examination

A detailed examination of the customer grid made it immediately clear this wasn’t going to be easy. There were many accounts with different names and email addresses. At the time, nearly 30,000 accounts were registered. Going through the list and deleting them manually was not an option. It would take too much time to open each account, examine it and decide whether it’s a spam account or a real customer. It had to be done with a script.

Recognizing possible patterns

There is no simple way of properly recognizing spam customer accounts. In order to delete them programmatically, you have to be sure you’re not going to delete a real customer. It would be a very unpleasant situation for a customer to be deleted from the website. Not only would the customer be unhappy, but all connections to his/her orders would be lost.

 

So, establishing a way to recognize only spam customer accounts comes down to a few steps.

1. Go through a reasonable number of spam accounts and write down the most frequently repeating similarities between them.

In this case, those would be the following:

– One or two capital letters at the end of a firstname or lastname, with the rest of the letters lowercase

– Email domains ending with *.ru, *.xyz, *.ua, *.top

– Numbers in the firstname or lastname

– Identical firstname and lastname

 

Account examples:

onlinecreditufedor@qip.ru, FedorKr FedorKroM

moiseevayeq1957@mail.ru, MichaelWoxeF MichaelWoxeFLP

abellayssard@homail.top, Assingnits AssingnitsDV

meme123@ccxpnthu2.pw, Ronaldtrek RonaldtrekWI

lesha.gorodnitsyn@mail.ru, VladimirCrOp VladimirCrOpAN

maksim.sakevich@yandex.ua, DouglasPhem DouglasPhemDU

ahtd95782@gmail.com, WalterDer WalterDerXV

mretsan@mail.ru, Simfum Алексей

kuch@vitalityspace.com, MartinRoot MartinRootWK

georgina14@dlfiles.ru, zirehohamew79 Taylor

ra.um@mail.ru, Somfum Димас

gfhherejft@mail.ru, FrankieDok Bartek

srhcgiarc@007game.ru, top2017bloomingme Beson

teod.or78@mail.ru, GlennZek Vlad Stahov

admin_3@iphone-ipad-mac.xyz, Xewrtyuoipye XewrtyuoipyeBP

abcd2775y38@nod03.ru, myregobahev87 Alejandro

akilaanka@qip.ru, aseoprodwig aseoprodwig

polysten@i.ua, CharlesSCARK CharlesSCARKJE

 

Magento code is as follows:

$customers = Mage::getModel('customer/customer')
    ->getCollection()
    ->addAttributeToSelect('*')
    ->addAttributeToFilter(
        array(
            array('attribute' => 'email', 'like' => '%.ru'),
            array('attribute' => 'lastname', 'regexp' => '[a-z][A-Z]{2}'),
            array('attribute' => 'firstname', 'regexp' => '[0-9]'),
            array('attribute' => 'lastname', 'regexp' => '[0-9]')
        )
);
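Before pointing filters like these at a live database, it helps to sanity-check the patterns against the sample accounts listed above. Here’s a minimal sketch in plain JavaScript that mirrors the filter logic – an illustration only (the function name and exact pattern set are ours, not Magento code):

```javascript
// Sanity-check the spam heuristics against sample data.
// isSuspicious() mirrors the collection filter above: a spammy email TLD,
// trailing capitals after a lowercase letter, digits in a name,
// or an identical firstname and lastname.
function isSuspicious(email, firstname, lastname) {
  const spamTld = /\.(ru|xyz|ua|top)$/i;
  const trailingCaps = /[a-z][A-Z]{2}/;
  const hasDigit = /[0-9]/;
  return (
    spamTld.test(email) ||
    trailingCaps.test(lastname) ||
    hasDigit.test(firstname) ||
    hasDigit.test(lastname) ||
    firstname === lastname
  );
}

// Samples from the list above:
console.log(isSuspicious('moiseevayeq1957@mail.ru', 'MichaelWoxeF', 'MichaelWoxeFLP')); // true
console.log(isSuspicious('akilaanka@qip.ru', 'aseoprodwig', 'aseoprodwig'));            // true
// A plausible real customer should pass:
console.log(isSuspicious('jane.doe@example.com', 'Jane', 'Doe'));                        // false
```

Running the whole sample list through a check like this is a cheap way to catch patterns that would also match legitimate customers, before any deletion script touches real data.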

 

2. Try to load addresses for each account

The first step should do the trick. But, to be sure that no real data will be lost from Magento, this additional step will be applied. This particular spam bot was unable to log in to the site – it only created a number of accounts, so none of those accounts had any address associated with them. If there is an account with an address, it will be skipped when deleting.

foreach ($customers as $customer) {
    $customerAddresses = $customer->getAddresses();
    if ($customerAddresses) {
        continue;
    }
}

 

3. Check if there are any orders for each account.

If there is an order associated with a customer account, it shall also be skipped when deleting. This is most probably a real account.

foreach ($customers as $customer) {
    $customerOrders = Mage::getModel('sales/order')
        ->getCollection()
        ->addAttributeToFilter('customer_id', $customer->getId())
        ->load();

    if ($customerOrders->count()) {
        continue;
    }
}

In this particular case, the filters had to be set very carefully because there are real customers on the site whose names are written in capital letters. They also don’t have any address registered, and therefore don’t have any orders either.

Before deleting a customer account, it is nice to have it written to a log file. Just in case.

After all checks have been made, spam customer accounts can be deleted simply by calling the $customer->delete() method in a loop.

Prevention

Most spam bots will be filtered out by activating Magento’s built-in CAPTCHA for the register form. It can easily be activated in administration under Settings->Customer->Customer Configuration->CAPTCHA. There are several options, as well as a choice of forms to activate it on.

As a custom solution and probably the best protection available, Google’s reCAPTCHA can be implemented on the register form. No bots shall pass then.

Conclusion

There are a number of different spam bots out there, so there is no simple and certain way of deleting accounts from the website once they are registered. They must be examined manually and patterns defined accordingly. There is no need to cover all of them – it’s impossible. After the majority has been deleted, the rest of the spam accounts are not so difficult to delete manually.

The post How to delete spam customer accounts from Magento appeared first on Inchoo.

How we developed a new Morris 4×4 Center store – case study

In 2017, we had a challenging task in front of us. It all started with an inquiry from a potential new client, Morris 4×4 Center, that was looking for a partner to help them address and solve website stability issues.

From that, it developed into a whole new project with the deadline of delivering a new site in under 2 months! Backend and frontend developers, design team, eCommerce consultants – we were all in it!

Let me tell you more about this project.

About Morris 4×4 Center

Morris 4×4 Center is a leading eCommerce destination for Jeep and 4×4 enthusiasts looking to outfit and enhance their vehicles. It provides more than 40,000 products across top brands, with passionate experts and a commitment to great customer experience.

Having fulfilled over a million orders in just over 25 years, Morris 4×4 Center’s passionate experts, superior customer service team, and new customer-centric initiatives are poised to better serve and help Jeep and off-road outdoor enthusiasts fulfill their dream of a great driving experience.

New design – Morris 4×4 Center

Old design – Morris 4×4 Center

What was the challenge in all of this?

Morris 4×4 Center is well known among their target audience, but the business itself had outgrown the platform’s capabilities and could not rapidly implement changes to scale the business further.

Many of the problems stemmed from an outdated eCommerce platform. We proved our expertise by solving technical issues, which then expanded our cooperation into a complete technical audit of their digital store. The detailed report on the findings and the actionable set of recommendations we provided set a clear path towards speeding up the store and improving its stability.

The technical audit was the pebble that started the avalanche. After our initial engagement, Inchoo was invited to bid on an eCommerce replatform project and we were chosen as the development partner for this larger project.

While it was well below zero in Inchoo’s hometown of Osijek, our team flew to Morris 4×4 Center’s headquarters, just outside Miami, Florida, and sealed the deal that we’ll develop a new store in less than 2 months. And we did just that!

What did the development process look like?

Planning, as the initial phase, was crucial in meeting the project deadline. The digital store was originally a hybrid between a static site and Magento, which made things even more challenging. We migrated it to Magento Enterprise 1.14.2, which brought stability, architectural control, and an almost painless connection to external systems (such as ERP, middleware systems and PIM).

Wireframing got us to the point where we identified the key elements to enhance visually and keep an eye on when developing them from the technical side. A clean design was crucial for displaying the complex information architecture in the menu, and we developed it in collaboration with Prototyp. Special attention was devoted to imagery and a media center for how-to videos, which helps keep Morris 4×4 Center a recognizable brand in their industry.
 
The Part Finder module brought an even better user flow for visitors to reach a specific car model and its parts. Based on a brand/model/year type of selection, it leads to a results page that then offers numerous specific attributes by which to filter further.

 
Diagrams are the icing on the cake for finding the exact name and product for the replacement car parts the user is looking for. They function as clickable images: the customer clicks on a number in the picture and gets the exact names of the associated products in the shop that match the results. It’s a useful feature for this kind of business, where even mechanics sometimes don’t know the exact name of a particular Jeep part.

What are the results?

Morris 4×4 Center is now not only a product destination, but also a content destination for off-road enthusiasts. The media-focused responsive design with great information architecture is now more intuitive to use.


The loyalty program and promotion options give great value not only to customers but also to store owners. With the implemented tracking options, it is now easier to analyze the site’s usage, test numerous promotional options and make the most of the gained information for remarketing purposes.

If you’re curious to learn what we did for some of our other clients, visit our portfolio page!

The post How we developed a new Morris 4×4 Center store – case study appeared first on Inchoo.

How to improve usability for Magento 2 add to cart process

Magento 2 introduced a new and improved way of adding products to the cart. The system now offers a complete asynchronous (ajax) process, although it is not activated in the default application state. It requires some manual adjustments to the script call inside the templates. Someone would think: ok, we will simply modify the template script calls and that’s it – we’ve successfully improved the process. Unfortunately, not so fast! There is still more room left for improvement. If you’re interested, let’s find out what’s left on the table.

Introduction

Although Magento’s ajax add to cart process offers improved experience and usability (the vast majority of users gets frustrated by continuous, back-to-back page reloading after adding each product to the cart), the Magento implementation only gets the job done in the core process of simply transmitting the data and showing either a success or an error message. While this certainly is a solid foundation, we’re still witnessing some usability issues along the way. For example, let’s put ourselves in the customer’s perspective for a bit. We’re browsing the store, currently located on the product page and ready to fire the CTA button to add our desired product to the cart. We scrolled a little bit down the page, probably to check the product specifications. The button is visible, but the top of the page and the header minicart are not. We trigger the button. When the process starts, the button changes its default state from “add to cart” to “adding”, and when the process is finalized, it changes the state to “added” before returning to the default state. That’s the only indication, from our current position on the page, that our product is actually added.

Someone would say: ok, that’s enough. But don’t forget we’re developers after all – we’re classified as advanced users. Not every customer is an advanced user who instantly recognizes things and processes along the way. So, where is the actual problem? There is no problem if you’re at the top of the page: after triggering the add to cart process, you’re able to see that the header minicart has changed, indicating the updated number of products currently added, and system messages are then triggered to inform you your cart has been updated and the product is therefore successfully added to the cart. But if you are not at the top of the page, there is no way you can see either the message or the minicart.

This situation left us with plenty of room for improvement. How can we improve the process? Simple enough. Once the ajax process is complete, we can trigger a scroll event to scroll to the top of the page, and as soon as the minicart gets updated, we can trigger the minicart UI dialog to open and show us the complete experience and what the minicart is actually offering us (either to go to the cart, go to the checkout or continue shopping).

In this post, I will demonstrate one possible way to do it.

Enabling ajax add to cart process

First things first – we need to get familiar with the architecture behind it.

In [magento2_root_dir]/vendor/magento/module-catalog/view/frontend/templates/product/view/add-to-cart.phtml there is a script call for the addtocart.js component. What we’re seeing is that its “bindSubmit” option is set to false. The first step is to change it to true and thereby enable the ajax add to cart process. Override this file in your theme.

<script type="text/x-magento-init">
    {
        "#product_addtocart_form": {
            "catalogAddToCart": {
                "bindSubmit": true
            }
        }
    }
</script>

Ok, we have successfully turned on the ajax add to cart process. What’s next? From the script initialization snippet inside the template, we have valuable information about which script is actually responsible for the process. The script alias catalogAddToCart points to the [magento2_root_dir]/vendor/magento/module-catalog/view/frontend/web/js/catalog-add-to-cart.js javascript component. We will not modify the script itself – we will go one step further and extend its ajaxSubmit method so we can make our custom modification in a separate place and leave the rest of the process as clean as possible. The procedure requires specific knowledge about the jQuery UI widget factory in order to be comfortable with the approach.

Extending widget responsible for establishing the ajax call

The main goal is to extend the $.mage.catalogAddToCart widget and modify its ajaxSubmit method. If you’re not familiar with the UI widget factory extending approach, I suggest getting familiar with the UI widget factory in general prior to continuing. I won’t go into specific details about core extending procedures – you can get familiar with them either on the jQuery documentation site or by checking out my previous article digging more deeply into the matter.

We will create a new require-js script component in our custom theme and call the catalog-add-to-cart script as a dependency using the require-js define module procedure.

First, create a script in the [magento2_root_dir]/app/design/frontend/[your_vendor_dir]/[your_theme_dir]/web/js/ directory. For the sake of this tutorial, we will call the script inchoo-ajax-cart.js. The script needs to be defined as a require-js module. Inside the script, we will extend $.mage.catalogAddToCart and override the ajaxSubmit method, adding a scroll animation to scroll to the top of the page once the ajax call is completed. Simply copy the whole ajaxSubmit method from the original file. The following code demonstrates how to do it.

/**
 * Copyright © 2013-2017 Magento, Inc. All rights reserved.
 * See COPYING.txt for license details.
 */
define([
    'jquery',
    'mage/translate',
    'jquery/ui',
    'Magento_Catalog/js/catalog-add-to-cart'
], function($, $t) {
    "use strict";
    $.widget('inchoo.ajax', $.mage.catalogAddToCart, {
 
        ajaxSubmit: function(form) {
            var self = this;
            $(self.options.minicartSelector).trigger('contentLoading');
            self.disableAddToCartButton(form);
 
            // setting global variable required for customized ajax cart process
 
            window.ajaxCartTransport = false;
 
            $.ajax({
                url: form.attr('action'),
                data: form.serialize(),
                type: 'post',
                dataType: 'json',
                beforeSend: function() {
                    if (self.isLoaderEnabled()) {
                        $('body').trigger(self.options.processStart);
                    }
                },
                success: function(res) {
                    if (self.isLoaderEnabled()) {
                        $('body').trigger(self.options.processStop);
                    }
 
                    if (res.backUrl) {
                        window.location = res.backUrl;
                        return;
                    }
                    if (res.messages) {
                        $(self.options.messagesSelector).html(res.messages);
                    }
                    if (res.minicart) {
                        $(self.options.minicartSelector).replaceWith(res.minicart);
                        $(self.options.minicartSelector).trigger('contentUpdated');
                    }
                    if (res.product && res.product.statusText) {
                        $(self.options.productStatusSelector)
                            .removeClass('available')
                            .addClass('unavailable')
                            .find('span')
                            .html(res.product.statusText);
                    }
                    self.enableAddToCartButton(form);
 
                    // animate scrolling to the top of the page 
 
                    $("html, body").animate({ scrollTop: 0 }, 1000, function() {});
 
                    // changing global variable value to true (this flag enables communication with update minicart logic)
 
                    window.ajaxCartTransport = true;
                }
            });
        }
    });
 
    return $.inchoo.ajax;
});

Map newly created custom js component in requirejs-config

Second step is to map the script in requirejs-config.js file in the root of your theme directory.

var config = {
    map: {
        "*": {
            "custom-ajaxcart": "js/inchoo-ajax-cart"
        }
    }
};

Don’t forget to check if your script is actually loaded – you can trace it in the network panel of your debugger of choice.

Initialize new custom js component in template file

The third step is to get back to our template where the original script is initialized and initialize our custom script instead of the default one. Remember, we’re still loading the default script as a dependency.

We call our script by the alias previously mapped in the requirejs-config.js file (note that the content of x-magento-init must be valid JSON, so comments are not allowed inside it):

<script type="text/x-magento-init">
    {
        "#product_addtocart_form": {
            "custom-ajaxcart": {
                "bindSubmit": true
            }
        }
    }
</script>

Now, going back to our custom script file, what you can see is that we’re relying on the success method, once we get status 200 OK from the server. We’re then animating the scroll to the top of the page using jQuery. You can use whatever approach you like, but since jQuery is already included in Magento, I have used it.

Maybe you’re now asking yourself: why is there nothing else in the code except the scroll to top method? Where is our code logic responsible for opening the minicart dialog? The answer is: there is no code! Why? The reason is simple – at this point we’re still not getting anything more than the indication that the post request is actually completed. Magento fires another, separate call to populate the minicart with the updated state. We need to be aware of this call, and we need to open the cart dialog once the minicart is updated with the new data.

We don’t want to expand the minicart UI dialog immediately from here because the data is still not updated. The best time to open the dialog is as soon as the minicart gets updated and populated with the new data. Since those two processes are not connected on the frontend, we need to set some sort of a flag so we know when to trigger the cart dialog opening. You will see I have created the ajaxCartTransport variable on top of the file and assigned it to the window object, making it global and always accessible. This flag will connect our process further on. Once the ajax success and scroll process is completed, you will see we’re setting the variable value to true. This is the flag we’re using to trigger the cart dialog opening in our next step.

Ok, now we know where our ajax call is; the next step is to find the logic responsible for updating the minicart with the fresh data.

Modify js component responsible for updating the minicart data

The logic lies in the [magento2_root_dir]/vendor/magento/module-checkout/view/frontend/web/js/view/minicart.js file. Since this file is a mixture of a UI component (knockout) and separate logic, I suggest you just copy and override the whole script in your theme.

Now, we need to locate the actual place where the data gets updated. It’s inside the UI component’s update method. Inside the update method, we will add our small piece of logic to expand the minicart dropdown dialog. Here is a quick example of how to do it.

/**
 * Update mini shopping cart content.
 *
 * @param {Object} updatedCart
 * @returns void
 */
update: function (updatedCart) {
    _.each(updatedCart, function (value, key) {
        if (!this.cart.hasOwnProperty(key)) {
            this.cart[key] = ko.observable();
        }
        this.cart[key](value);

        // our logic for opening the minicart
        if (window.ajaxCartTransport == true) {
            // finding the minicart wrapper element
            var minicart = $('[data-block="minicart"]');

            // finding the dropdown element itself and invoking the dropdownDialog open method
            minicart.find('[data-role="dropdownDialog"]').dropdownDialog('open');

            // setting our custom global variable immediately back to false
            window.ajaxCartTransport = false;
        }
    }, this);
},

What did we do here? First, we checked our flag to make sure we will expand our dialog only if the ajax call was triggered shortly before. We check if the global ajaxCartTransport variable is set to true and, if it is, we open up the minicart dialog. We open it in a best-practice manner, invoking the UI .dropdownDialog(‘open’) method – the jQuery widget factory provides a nice feature, widget method invocation. We need to make sure to set our flag back to false immediately after, to prevent the minicart from opening on every other update process that is not connected to our custom ajax add to cart process.

Conclusion

And here we are – we have successfully improved our ajax add to cart process. Now the customer is instantly aware, no matter where on the page they are currently located. We have extended the ajax cart process, added our scroll to top animation, waited for the minicart to update the data and opened the UI dropdownDialog once the procedure is completed and the data is fully updated.

Maybe there is a better/cleaner way to handle the issue – if you’re eager for more and strive for an even more sophisticated solution, drop a note in the comments section. I will be happy to discuss it further.

Happy coding! 🙂

The post How to improve usability for Magento 2 add to cart process appeared first on Inchoo.

Using PostCSS with Sass

In this article we’ll take a basic look at PostCSS from the perspective of a developer whose current CSS development process includes the use of a CSS preprocessor, in particular Sass. If you’re a Sass user, there are a couple of approaches when starting out with PostCSS. You could make a complete switch and recreate your basic preprocessing environment with PostCSS, or you could start using it as a supplement.

Many of you will say that you still only rely on your favorite preprocessor, but then, it’s possible that you’re also using Autoprefixer for vendor prefixing, and guess what? In this case, you have already included PostCSS into your workflow.

What exactly are we talking about?

PostCSS is a tool, or basically, just an API which handles its plugins written in JavaScript.

Compared to Sass, which has a bunch of features out of the box, PostCSS comes as a blank plate, ready to be filled with the ingredients you need.

Basic Setup

Including PostCSS in your project is not a complicated process, especially if you have basic experience with one of the task runners, such as Gulp or Grunt.

As a simple example, let’s take a look at the following gulpfile.js.

var gulp = require('gulp'),
    postcss = require('gulp-postcss'),
    autoprefixer = require('autoprefixer');

// gulp-postcss expects an array of PostCSS plugins,
// and gulp.dest() takes the output directory
gulp.task('css', function() {
  return gulp.src('src/style.css')
    .pipe(postcss([
      autoprefixer()
    ]))
    .pipe(gulp.dest('dest'));
});

What we see here is a two step process:

  1. First, we include the main PostCSS module.
  2. Next, we add the PostCSS plugin(s) we want to use (which in this short example is only one – Autoprefixer).

Of course, like with any new gulp plugin you include in your gulpfile.js, the PostCSS module and any additional PostCSS plugins need to be installed first. This can be done in a terminal with a simple command, familiar to all Gulp users:

npm install gulp-postcss autoprefixer --save-dev

Choosing plugins

So, which plugins do we need? Well, this comes down to individual choice. For an easy start, or just to supplement your preprocessing workflow with some additional power, you will certainly gain an instant benefit from these two:

  • Autoprefixer – probably the most popular PostCSS plugin, used for adding required vendor prefixes. As already mentioned at the beginning, there is a high chance that you’re already using this one.
.box {
  display: flex;
}
 
// Result after processing
.box {
  display: -webkit-box;
  display: -webkit-flex;
  display: -ms-flexbox;
  display: flex;
}
  • Stylelint – a linting plugin useful for maintaining consistent conventions and avoiding errors in your stylesheets.
  • If you want to go deeper and recreate your basic Sass environment, most likely you’ll also need the following plugins:

    • Postcss-simple-vars – gives us Sass-like variables.

    $blue: #056ef0;
    $column: 200px;
     
    .menu_link {
        background: $blue;
        width: $column;
    }
     
    // Result after processing
    .menu_link {
        background: #056ef0;
        width: 200px;
    }
    • Postcss-nested – gives us the functionality of unwrapping nested rules, just like Sass does.
    .phone {
        &_title {
            width: 500px;
            @media (max-width: 500px) {
                width: auto;
            }
        }
    }
     
    // Result after processing
    .phone_title {
        width: 500px;
    }
    @media (max-width: 500px) {
        .phone_title {
            width: auto;
        }
    }
    • Postcss-mixins – brings mixin support, similar to Sass mixins.

    @define-mixin icon $network, $color: blue {
        .icon.is-$(network) {
            color: $color;
            @mixin-content;
        }
        .icon.is-$(network):hover {
            color: white;
            background: $color;
        }
    }
     
    @mixin icon twitter {
        background: url(twt.png);
    }
    @mixin icon youtube, red {
        background: url(youtube.png);
    }
     
    // Result after processing
    .icon.is-twitter {
        color: blue;
        background: url(twt.png);
    }
    .icon.is-twitter:hover {
        color: white;
        background: blue;
    }
    .icon.is-youtube {
        color: red;
        background: url(youtube.png);
    }
    .icon.is-youtube:hover {
        color: white;
        background: red;
    }

One of the most interesting plugins, which we’re mentioning last, is CSSNext. This is actually a collection of plugins that, together, give us the possibility to use the latest CSS syntax today. It transforms new CSS specs into more compatible CSS, without the need to wait for browser support. CSSNext has a lot of features, some of them being:

  • custom properties set & @apply
  • custom properties & var()
  • custom selectors
  • color() function
  • :any-link pseudo-class, etc.

In your CSS file you can do something like this:

// Example for custom properties set & @apply
:root {
  --danger-theme: {
    color: white;
    background-color: red;
  };
}

.danger {
  @apply --danger-theme;
}

// Result after processing
.danger {
  color: white;
  background-color: red;
}

Why should you use PostCSS?

So, if you already have an effective workflow and have been satisfied with your favorite preprocessor for some time now, you might still be asking yourself: why do I need to learn another tool (or make the switch from Sass)? What are the benefits?

To answer these questions, let’s summarize some of the advantages:

  • Speed – even though Sass has in the meantime become significantly faster (e.g., LibSass), PostCSS is still the winner here
  • Modularity – reduces bloat; you only include the functionality that you need
  • Lightweight – this comes along with the previous benefit
  • Immediate implementation – if you want new functionality, you don’t have to wait for Sass to be updated; you can build it on your own

Of course, everything’s not ideal and there are also certain drawbacks:

  • Increased complexity – more planning is required (e.g., plugins must be called in a specific order)
  • A different syntax (compared to Sass)
  • PostCSS processing requires valid CSS

    What’s next

    It’s perfectly clear that PostCSS is all about the plugins. At the time of writing, there are more than 200 plugins available (and this number is only getting bigger). So, to go beyond the basics, you’ll need to search for other plugins that will extend this barebones setup.

    Of course, if you find out that some handy functionality is missing, go ahead and solve the problem by making your own PostCSS plugin.
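    As a sketch of what that involves: in newer PostCSS releases (the object-style plugin API of PostCSS 8+), a plugin is essentially a set of visitor callbacks. The plugin name and the px-to-rem behavior below are made up purely for illustration:

```javascript
// A minimal, hypothetical PostCSS plugin (object-style API, PostCSS 8+).
// It rewrites px values into rem, assuming a configurable root font size.
const pxToRem = (opts = {}) => {
  const rootSize = opts.rootSize || 16; // assumed default root font size

  return {
    postcssPlugin: 'postcss-px-to-rem-sketch',
    // Called by PostCSS for every declaration in the stylesheet
    Declaration(decl) {
      decl.value = decl.value.replace(
        /(\d*\.?\d+)px/g,
        (match, n) => `${parseFloat(n) / rootSize}rem`
      );
    },
  };
};
pxToRem.postcss = true;

module.exports = pxToRem;
```

    Once registered in the processing pipeline like any third-party plugin, it would rewrite every px value it encounters; rootSize is a hypothetical option, defaulting to the common 16px root font size.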

    The post Using PostCSS with Sass appeared first on Inchoo.


    The process of improving online store usability and design your customers will enjoy

    The only point of contact your customer has with your online store is the designed interface. Ever wondered exactly what they think of it? Long gone are the days when design was a purely visual discipline. It has shifted into a responsibility where designers are also valued for their understanding of the product being built. Design doesn’t just paint the building; it builds the stairs to better overall usability and to collaboration of everyone included. It is necessary to update our processes with a thorough understanding of end users, colleagues and stakeholders to make responsible, risk-reduced design decisions. Change through user testing is what keeps the design process relevant and competitive.

    In a single sentence, we work to make incremental user interface and user experience changes based on usability test results and actual user feedback. This process ensures we’ve improved your online store’s usability, not just with our knowledge of best UX practices, but also with real data from your own customers. This process also keeps us proactive, as we’re able to find bottlenecks we would have never thought of or noticed otherwise. Here’s a short overview of the process – and for more information, you can always drop us a line!

    Get to know your online store

    For the best insights, a combination of qualitative and quantitative data works best. With quantitative data from Google Analytics and Hotjar, we start to understand the site and its possible bottlenecks. If possible, we conduct a preliminary unmoderated live study on the current site, which gives us valuable qualitative data and gets us working on well-founded ideas for the redesign. The live study includes audio and video feedback from customers instructed to complete tasks on usual store flows. An ideal alternative would be to visit our clients for a workshop to gather most of the customer information, but we usually have a remote kickoff meeting and ask most of the questions then.

    Additionally, to get a jumpstart on our research phase, we initiate a recruiting screener (we use Hotjar) on the live site to get started on gathering our group of relevant test users.

    Analyse from the ground up

    User journeys help us define customer activity, goals, needs, expectations, touchpoints, quality of experience, business goals as well as organisational activities (ideas and solutions). All defined benchmarks and project goals will be tracked to present the results of the new design. User journeys also help define the user tasks that will be included in usability tests later on. Personas guide us in conducting these tests on specific customer profiles (we can pick and choose from recruits as audience segments). Towards the end of this phase, a sitemap is defined. The sitemap helps the team visualise the structure of the navigation and the relationship between pages and taxonomy. It also serves as a starting point for wireframes, functional specifications and content maps.


    Boldly go where no one has gone before

    During the exploration phase, we create solution concepts. The team gets together for another workshop, each member contributing their specific expertise. We brainstorm and sketch pages (whiteboard), which are turned into medium-fidelity wireframes (Sketch) and clickable prototypes (InVision). These encompass all of the ideas and changes the team has uncovered so far. The next step is to validate or dispute them through tests on actual online store customers we’ve been recruiting up until this point.

    Test the prototypes, learn and repeat

    We test our minimum viable product to analyze and refine it in as many cycles as needed. Tools like Validately are used to perform unmoderated live studies on our InVision prototype, assigning tasks to recruited test users. Recordings of these tests are then analysed, paying attention to test duration, task error rate, task completion and answers to custom follow-up questions. So, what do these live studies contain? We receive audio as well as video of the prototype in use, with our most important findings being the result of user interaction with the prototype through assigned tasks (e.g. buy a specific product, add it to cart and check out). Tasks can range in complexity depending on what we’re trying to learn. In collaboration with the client, we arrange an incentive to be provided for all test participants. Having direct feedback on early prototypes from relevant users ensures a significantly reduced risk of friction in our future designs.


    Suit up your designs

    With early tests done, we’re confident enough to proceed to the design phase, opening up a dialogue to communicate the moodboard look and feel as well as create the user interface design. The moodboard helps us communicate the purely visual aspect of the project with the stakeholders involved. Following the moodboard, the homepage design sets the look of the site while, at the same time, we start developing the style guide all pages will be designed upon. For the entire user interface, we focus on creating user flows instead of presenting mere pages. Flows are again presented through InVision prototypes and include all of the findings from the previous testing phase that both the client and the team agree on implementing.

    Test again, don’t assume

    The completed designs are based not only on industry best practices but also on actual customer feedback. Nevertheless, we test again to see how the designs perform from a usability and user experience point of view, avoiding any assumptions that might not prove to be true. The mechanisms we tested on wireframes might work, but our design influences their effectiveness as well. We repeat this cycle until optimal solutions are found.

    Post launch care

    After launch, we keep monitoring the live site with the goal of optimising the organic conversion rate. We continue the usability tests and include A/B tests to provide a basis for further incremental design changes. Tracking the previously set benchmarks and goals gives us results to affirm the work so far. After all, your online store is a living, ever-changing organism that can be improved at any point.

    All key teams at Inchoo (designers, consultants and developers) are included in the decision-making process, and at all times we work closely with our client. This approach serves optimal store usability, as everyone in the team contributes from their perspective while changes and improvements are based on real customer feedback. If there is a concern whether an investment in such a thorough process is justified, consider the risks of not investing in testing at all and basing your decisions on information that doesn’t include your customers’ feedback.

    We would love to take you and your customers on this journey and show you the value of our approach firsthand. Contact us to see how we can improve your store’s usability!

    The post The process of improving online store usability and design your customers will enjoy appeared first on Inchoo.

    Using Zend for intensive data processing

    Sometimes there are cases where a lot of data processing needs to be done, and using Magento models and resources is either too slow or too intensive for your solution. This is when the Zend Framework, which Magento is built upon, jumps in. Naturally, you can write raw PHP/MySQL functionality for your needs, but if you want to keep your code clean and reusable, using Zend functionality is the way.

    Requests like building a feed generator, or other custom scripts that would be too heavy if they initialized full Magento, are quite frequent and can be handled by writing a shell script (located in the shell directory of the Magento project root) which uses only the resources needed to complete the task.

    Our example script will fetch a couple of basic product attributes, like sku, name, and weight directly from Magento EAV tables.

    The first step of the script is to create the backbone by extending the Magento class Mage_Shell_Abstract, which includes Mage.php, needed for basic initialization of project classes. We will also set up our class and the attributes needed for this example. Let’s create a file called inchoo-example.php in the shell directory of our Magento project:

    require_once 'abstract.php';
     
    class Inchoo_Custom_Script extends Mage_Shell_Abstract
    {
     
        public $r = null;
        public $c = null;
        public $entityTypeId = null;
        public $productTypeId = null;
        public $tables = array();
     
        public function _construct()
        {
            /** @var Mage_Core_Model_Resource $r */
            /** @var Varien_Db_Adapter_Interface $c */
     
            // Init connection and tables
            $this->r = Mage::getSingleton('core/resource');
            $this->c = $this->r->getConnection(Mage_Core_Model_Resource::DEFAULT_WRITE_RESOURCE);
     
            // Entity type for products
            $this->entityTypeId = Mage::getModel('eav/config')
                ->getEntityType(Mage_Catalog_Model_Product::ENTITY)
                ->getEntityTypeId();
     
            // Product type ID
            $this->productTypeId = Mage_Catalog_Model_Product_Type::TYPE_SIMPLE;
     
            // Table names
            $this->tables['cpe'] = $this->r->getTableName('catalog_product_entity');
            $this->tables['ea'] = $this->r->getTableName('eav_attribute');
            $this->tables['cpev'] = $this->r->getTableName('catalog_product_entity_varchar');
            $this->tables['cped'] = $this->r->getTableName('catalog_product_entity_decimal');
            $this->tables['cpei'] = $this->r->getTableName('catalog_product_entity_int');
        }
    }
     
    $script = new Inchoo_Custom_Script();
    $script->run();

    Besides extending Mage_Shell_Abstract, we set up the things we need for script processing in _construct(), such as the table names and entity types we will use when fetching data from the tables. We will use the eav_attribute table to get the ID of an attribute based on its code. That ID will be used with the catalog_product_entity_decimal table for fetching weight, catalog_product_entity_varchar for fetching name, and catalog_product_entity_int for fetching the status of the product (whether the product is enabled or not).

    The next step is the core of our application, the Zend SQL call that uses the data we set up in _construct():

    public function run()
        {
            $queryData = $this->c->select()
                ->from(
                    array('cpe' => $this->tables['cpe']),
                    array(
                        'ID' => 'cpe.entity_id',
                        'SKU' => 'cpe.sku'
                    )
                )
                // Join Name
                ->joinInner(
                    array('ea' => $this->tables['ea']),
                    "ea.attribute_code = 'name' AND ea.entity_type_id = {$this->entityTypeId}",
                    null
                )
                ->joinLeft(
                    array('cpev' => $this->tables['cpev']),
                    'cpev.attribute_id = ea.attribute_id AND cpev.entity_id = cpe.entity_id',
                    array('name' => 'cpev.value')
                )
                // Join Weight
                ->joinInner(
                    array('ea1' => $this->tables['ea']),
                    "ea1.attribute_code = 'weight' AND ea1.entity_type_id = {$this->entityTypeId}",
                    null
                )
                ->joinLeft(
                    array('cped' => $this->tables['cped']),
                    'cped.attribute_id = ea1.attribute_id AND cped.entity_id = cpe.entity_id',
                    array('weight' => 'cped.value')
                )
                // Join Status
                ->joinInner(
                    array('ea2' => $this->tables['ea']),
                    "ea2.attribute_code = 'status' AND ea2.entity_type_id = {$this->entityTypeId}",
                    null
                )
                ->joinInner(
                    array('cpei' => $this->tables['cpei']),
                    'cpei.attribute_id = ea2.attribute_id AND cpei.entity_id = cpe.entity_id',
                    null
                )
                ->where("cpei.value = 1 AND cpe.type_id = '{$this->productTypeId}'")
                ->limit(10);
     
            $values = $this->c->query($queryData)->fetchAll(Zend_Db::FETCH_UNIQUE);
     
            return $values;
        }

    The call to $this->c->select() initializes Varien_Db_Select, which handles query building with the help of Zend_Db_Select. Inside lib/Zend/Db/Select.php you can find the class specification and the functions used in this example, such as from(), the various joins, and other supported wrappers for MySQL queries.

    If you’re fetching a large amount of data and you run out of memory, you can fetch the results one by one in a while loop, which results in slower execution but doesn’t waste as much memory:

    public function run()
        {
            $queryData = $this->c->select()
                ->from(
                    array('cpe' => $this->tables['cpe']),
                    array(
                        'ID' => 'cpe.entity_id',
                        'SKU' => 'cpe.sku'
                    )
                )
                // Join Name
                ->joinInner(
                    array('ea' => $this->tables['ea']),
                    "ea.attribute_code = 'name' AND ea.entity_type_id = {$this->entityTypeId}",
                    null
                )
                ->joinLeft(
                    array('cpev' => $this->tables['cpev']),
                    'cpev.attribute_id = ea.attribute_id AND cpev.entity_id = cpe.entity_id',
                    array('name' => 'cpev.value')
                )
                // Join Weight
                ->joinInner(
                    array('ea1' => $this->tables['ea']),
                    "ea1.attribute_code = 'weight' AND ea1.entity_type_id = {$this->entityTypeId}",
                    null
                )
                ->joinLeft(
                    array('cped' => $this->tables['cped']),
                    'cped.attribute_id = ea1.attribute_id AND cped.entity_id = cpe.entity_id',
                    array('weight' => 'cped.value')
                )
                // Join Status
                ->joinInner(
                    array('ea2' => $this->tables['ea']),
                    "ea2.attribute_code = 'status' AND ea2.entity_type_id = {$this->entityTypeId}",
                    null
                )
                ->joinInner(
                    array('cpei' => $this->tables['cpei']),
                    'cpei.attribute_id = ea2.attribute_id AND cpei.entity_id = cpe.entity_id',
                    null
                )
                ->where("cpei.value = 1 AND cpe.type_id = '{$this->productTypeId}'")
                ->limit(10);
     
            $values = $this->c->query($queryData);
     
            $return = array();
            while($result = $values->fetch()) {
                // Do something with data here
                $return[] = $result;
            }
     
            return $return;
        }

    When everything is done, you can call the script from the Magento root with php -f shell/inchoo-example.php.

    This is it!

    Hopefully, this will help you automate frequent or one-time heavy-duty tasks for your project!

    The post Using Zend for intensive data processing appeared first on Inchoo.

    On the road again – heading off to Imagine!

    Every year, Imagine aims for new heights. I mean, 150+ speakers with top names from the industry, 3000 attendees and Phillip Jackson as Master of Ceremonies? What could beat that?!

    The Twitter feed is filled with #MagentoImagine and #RoadToImagine posts, and they are a delight to watch. The community is showing its strength once again. We can’t say this enough, but there is no community like the Magento community! Am I right or am I right?

    At Inchoo, we try our best to show our support at as many Magento events as we can, and we’re even organizing some of them again this year (Meet Magento Croatia 2018), so we’ll make no exception for Imagine. Our Project Manager Antonija Tadic is packing her suitcases and heading off to Vegas. Of course, one of the suitcases will be empty so she can bring conference goodies back for the rest of us. 🙂

    Since life is too short for long interviews, we did a quick scan through her plans for Imagine.

    Antonija, will this be your first Imagine? What are you expecting from it?

    Indeed, this will be my very first Imagine and I’m going solo, so it’s going to be one big adventure, to say the least. But I’m very excited that this opportunity was given to me, and I want to use it the best I can. At the same time, I don’t want to set huge expectations for myself. I’m looking forward to gaining some valuable contacts, meeting with partners, and listening to the keynote speakers (especially Jamie Foxx 🙂 ). And of course, to feeling this big community vibe and having some fun with all of these people that breathe Magento!

    What are you looking forward to the most from keynotes and sessions?

    A lot of things, actually. If you look at the speakers list, you will notice many female speakers. As a woman whose passion is empowering and motivating other women, seeing that many successful women speakers at a Magento conference is something I’m very proud of and excited about. I haven’t had a chance to meet most of them before, so I’m looking forward to listening to their talks and meeting them during the conference.

    Did you see #tipsForImagine? You have been to many conferences – what would be your No. 1 tip for surviving an event like Imagine?

    You should ask me that again when I’m back! 🙂

    Businesswise, I would say to prepare well and be social. I think these two are very important. I’ve found this dotmailer’s article very informative – A quick guide for Imagine 2018 for making the most of the event.

    Not business related – bring sunscreen and comfy shoes. The desert combined with walking the Strip while exploring the city is a winning recipe for sunburn and sore feet!

    Since this is your first time in Vegas, what are you planning to visit and experience there?

    I will not have much free time, so I will have to plan my time wisely. I definitely want to explore Las Vegas as much as I can and feel the vibe of the city, have a blast at some of the entertainment shows Vegas is full of, and, if there is enough time, visit the Grand Canyon on the last day of the trip.

    If you have any suggestions where Antonija should go as a tourist in Vegas, or you wish to meet with her at the conference, leave us a comment below, or follow her on Twitter.

    Antonija, bon voyage! Send us lots of pictures! Mingle, and have fun, that’s what conferences are for!

    The post On the road again – heading off to Imagine! appeared first on Inchoo.

    This isn’t your typical Magento Imagine recap

    Since many folks who attended the Imagine conference will write a blog post about their experience, I was wondering how mine could differ. I don’t want to just share what was happening there, as this whole adventure is hard to put into words and Magento has already covered it pretty well.

    But let’s see what I will be able to deliver.

    So, how did I end up going to Imagine in the first place?

    This is, indeed, one interesting story to tell. Long story short: we at Inchoo wanted to send a few people, but at the last minute we decided not to, as we had some organizational changes in the company. Ten days before Imagine, our client (KEH Camera) reached out to Tomislav with one ticket available, and we agreed I should be the one attending.

    So, in this short period before the conference, I started exploring Twitter for useful tips for Imagine newbies. Pleasantly surprised that there were so many helpful tips and tricks, my stress level dropped to a minimum. Besides, sometimes having less time to prepare is the key to having an extraordinary time.

    Alone in Las Vegas

    “How is it possible you had the courage to attend Imagine all by yourself?“ That was the question I was asked time and again, by my coworkers and the people I met there alike.

    Truth be told, I was a little bit afraid, but even more excited to explore the unknown. I had also received huge encouragement from our CEO Tomislav Bilić. He asked me if I was willing to travel alone and, at that time, I was a little indecisive, but he got me on board when he said:

    “You know, I firmly believe that the best possible experience you can have there is if you are exploring it alone. My first time attending Imagine, I was all by myself, and I had the best time ever.” 🙂

    First Impression

    After Tomislav’s encouragement, Aron’s four pages (Magento Imagine tips & tricks) and Vesna’s original little notes, with a bag full of Inchoo swag, I was ready to go!

    The eighth Imagine conference, held at the dreamy Wynn, is the biggest Magento conference in the world. At Imagine, everyone is there! The crème de la crème of Magento: solution partners, technology partners, developers, merchants, various 3rd party integrators and everyone else who breathes Magento to the fullest. It is a huge gathering of people seeking business opportunities and valuable information, meeting old and new friends, and having a good time while doing the business they love.

    This year’s theme was Lead The Charge, and the Master of Ceremonies was Phillip Jackson.

    He shared his inspirational journey of how he decided to do something about his health and lost the weight he had gained. It was very touching. I liked how he put what it takes to succeed:

    “You just need to put on your running shoes. You just need to do the next right thing.”

    During the three-day event, many great stories were shared at the general and breakout sessions, but there is one I particularly liked: Melissa Ben-Ishay’s (Baked By Melissa).

    Her honest road-to-success story stole the show, in my opinion. She lost a job she didn’t like but was very passionate about her cookies. She was surrounded by the right people at the right time, didn’t know anything about eCommerce, but somehow managed to become a leader in the cute cookie business. The story was pure and from the heart. I could feel it.

    She reminded me that passion, timing and doing what you love are key to success.

    The Magento Roadmap and new community initiatives

    The very exciting thing about Imagine is that you get to hear firsthand about the new features and functionalities coming with new Magento releases.

    So, it was announced that:

    Also, during the last general session, new exciting community initiatives were announced:

    I can’t wait to hear more about their strategy and plans.

    Jamie Foxx controversy

    As the Jamie Foxx talk was approaching, people all over Twitter and on the official Magento forum were asking how Jamie and Magento are connected. They were also a bit critical of him being a speaker there. So, to spice things up a little, even Aron poked a little fun at it.

    Personally, I was looking forward to seeing and hearing him on the stage. I also knew he has a Magento store, so for me it made perfect sense. I was hoping he would give a bit of a motivational speech, but also talk eCommerce, as he and his people run the business.

    From the moment Jamie entered the stage, he owned it. Mark Lavelle was trying to keep up with him, but Jamie was just too strong, doing whatever he wanted with the audience. He is first a stand-up comedian, then an actor and a singer.


    So, he did a stand-up routine, everyone was laughing from the heart, and it was exactly what we all needed. I don’t remember the last time I laughed so loud and hard.

    Jamie was an excellent choice. Even though it was supposed to be a one-on-one interview, it ended up being Jamie owning the room (stage). I firmly believe that anyone who had second thoughts about him changed their mind after his extraordinary performance.

    Women in Magento community

    Not sure if anyone has addressed this yet, but a strong message was sent from Imagine this year.

    As a woman who is very vocal about empowering women of all kinds in my local city and beyond, I was positively surprised to see so many female speakers on the stage this year. And I was certainly excited to hear them take the stage. It is important that women within this community feel good and are encouraged to give a talk.

    I also liked what Magento CEO Mark Lavelle said during the opening speech about Karen Baker, who has been a big influencer in the Magento community from early on.

    “She is constantly helping making sure that Magento CEO is never too secure in his job.”

    It was a positive comment referring to how a strong-opinioned woman can shake the community up and make her voice heard. That was the best part of his speech.

    Later on, Mark and Karen even commented on it on Twitter, making it clear that if anyone understood this differently from how it was presented, they got it wrong. I simply loved it.

    There’s always room for improvement

    Since we all strive to be better, there is always at least one tiny little thing that can be improved. Before I say anything else, I would like to note that Imagine is one of the best-organized conferences I have ever attended, and I do have a number of conferences behind me.

    At times, though, I felt that speakers reading from the on-stage screen was too visible to the audience. Maybe the text was too small, or the screen was positioned incorrectly, but it was noticeable.

    That is the only area where I see room for improvement; other than that, it will be hard to top this year. 🙂

    What is Imagine all about

    On the very last conference day, I was sitting in the B Bar at the Wynn hotel with Interactiv4 CEO Ignacio Riesco, talking about all the Imagine events he has experienced so far. He told me some funny stories and about some big decisions he had made each time he attended. Hearing all that changed how I perceive Imagine and its purpose.


    Indeed, it is a conference where you can find valuable leads, do business, find a partner on the Marketplace, hear some good talks and exciting news, and meet all those people.

    More importantly, it is about what you bring back home to your work, which directions you choose after experiencing Imagine, and what you bring back when you return next year. It’s about the individual and their personal growth through business. Does that make sense to you?

    So, for me, from day one to the very last day, it was an indescribable experience. And it has changed me. A lot. Time will tell how I am going to use this experience. Hopefully, in the best possible way.

    So, what’s next?

    To be completely honest, I left the conference with one strong thought. Next year, on the Imagine stage, I would love to see Inchooer(s) – whether giving a talk, receiving an award for contribution, or being recognized for an enterprise case-study project. I was so happy for all our friends receiving some huge recognition, but I couldn’t help thinking that Inchoo deserves to be there too.

    It is no secret that we aren’t satisfied with where Magento 2 is at the moment. Knowing that some of the most critical bugs aren’t fixed, we have started exploring some other solutions as well. Nevertheless, Magento still makes up more than 80% of our projects, and I hope to see more positive changes in that direction.


    The post This isn’t your typical Magento Imagine recap appeared first on Inchoo.

    Unit testing in Magento 2

    Magento 2 ships with PHPUnit, an automated testing framework for PHP, as one of its dependencies. Covering the basics of PHPUnit is out of the scope of this tutorial; after a short introduction, we are going to focus on a practical example of using PHPUnit with Magento 2. For those interested in PHPUnit basics, I recommend reading the documentation or one of the many tutorials on the web, since it is a very well documented topic.

    What is Unit Testing?

    Unit testing is a level of software testing where individual units/components of a software are tested. The purpose is to validate that each unit of the software performs as designed. A unit is the smallest testable part of any software; it usually has one or a few inputs and usually a single output. The unit testing process is separate and completely automated, without any manual handling.

    Why Unit Test?


    Every once in a while in the development world, the same questions appear. Why would I unit test my code? Is it worth the effort?

    So what would actually be the benefits of unit testing?

    • Find problems early

    Unit testing finds problems early in the development cycle. This includes both bugs in the programmer’s implementation and flaws or missing parts of the specification for the unit. The process of writing a thorough set of tests forces the author to think through inputs, outputs, and error conditions, and thus more crisply define the unit’s desired behavior.

    • Facilitates change

    Unit testing allows the programmer to refactor code or upgrade system libraries at a later date, and make sure the module still works correctly. The procedure is to write test cases for all functions and methods so that whenever a change causes a fault, it can be quickly identified.

    •  Design

    Writing the test first forces you to think through your design and what it must accomplish before you write the code. This not only keeps you focused; it makes you create better designs. Testing a piece of code forces you to define what that code is responsible for. If you can do this easily, that means the code’s responsibility is well-defined and therefore that it has high cohesion.

    Writing a simple unit test in Magento 2

    Given that we have a custom module named Testing under the Inchoo namespace (app/code/Inchoo/Testing), our unit tests will reside inside the Test/Unit folder, according to naming standards.

    The class that we are going to test will be called SampleClass and will reside in the TestingClass folder, so the whole path is app/code/Inchoo/Testing/TestingClass/SampleClass.php.
    Our class code looks like this:

    <?php
    namespace Inchoo\Testing\TestingClass;
     
    class SampleClass
    {
        public function getMessage()
        {
            return 'Hello, this is sample test';
        }
    }

    As you can see, this class is pretty straightforward – it just returns a simple string.
    Our test for the getMessage() method will look like this:

    <?php
     
    namespace Inchoo\Testing\Test\Unit;
     
    use Inchoo\Testing\TestingClass\SampleClass;
     
    class SampleTest extends \PHPUnit\Framework\TestCase
    {
        /**
         * @var \Inchoo\Testing\TestingClass\SampleClass
         */
        protected $sampleClass;
     
        /**
         * @var string
         */
        protected $expectedMessage;
     
        protected function setUp()
        {
            $objectManager = new \Magento\Framework\TestFramework\Unit\Helper\ObjectManager($this);
            $this->sampleClass = $objectManager->getObject(SampleClass::class);
            $this->expectedMessage = 'Hello, this is sample test';
        }
     
        public function testGetMessage()
        {
            $this->assertEquals($this->expectedMessage, $this->sampleClass->getMessage());
        }
     
    }

    As you can see, this unit test just checks the output of the getMessage() method; for the test to pass, the output must equal the expectedMessage property. Our test extends PHPUnit’s TestCase class, which gives us the assertEquals() method along with many other testing utilities. There is also the setUp() method, which runs before each test method; it sets up the testing environment along with every object or property we will need in the test.
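    Once a test like this is in place, it is easy to extend. As a hypothetical sketch (the greet() method is invented for this example; @dataProvider itself is standard PHPUnit), a data provider lets one test method cover several input/output pairs:

```php
<?php
namespace Inchoo\Testing\Test\Unit;

// Hypothetical sketch: suppose SampleClass also had a greet($name)
// method returning "Hello, $name". A PHPUnit data provider runs
// testGreet() once per row returned by greetDataProvider().
class GreetTest extends \PHPUnit\Framework\TestCase
{
    /**
     * @dataProvider greetDataProvider
     */
    public function testGreet($name, $expected)
    {
        $sampleClass = new \Inchoo\Testing\TestingClass\SampleClass();
        $this->assertEquals($expected, $sampleClass->greet($name));
    }

    public function greetDataProvider()
    {
        return [
            ['John', 'Hello, John'],
            ['Jane', 'Hello, Jane'],
        ];
    }
}
```

    This keeps the assertion in one place while the set of cases grows as a plain data table.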

    Running unit tests in PhpStorm

    There are multiple ways to run PHPUnit test:

    • CLI
    • Magento Commands
    • IDE integration (PhpStorm)

    In this section, we will describe how to run unit tests in PhpStorm.

    To run unit tests in PhpStorm, the IDE needs to be configured first. There is a PHPUnit configuration file shipped with Magento 2 – {ROOT}/dev/tests/unit/phpunit.xml.dist

    The configuration file tells PHPUnit where to find the test cases to run (along with other information). To make it active, copy the file into the same folder and remove the .dist extension, so the name is now phpunit.xml.

    In the file, under the <testsuite> section, we need to add our test location. In this example, the other test locations are commented out because we don’t want PHPUnit to run all the tests, just the one we wrote earlier. So our phpunit.xml should look like this:

    <testsuite name="Magento Unit Tests">
        <directory suffix="Test.php">../../../app/code/Inchoo/Testing/Test/Unit</directory>
        <!--<directory suffix="Test.php">../../../lib/internal/*/*/Test/Unit</directory>-->
        <!--<directory suffix="Test.php">../../../lib/internal/*/*/*/Test/Unit</directory>-->
        <!--<directory suffix="Test.php">../../../setup/src/*/*/Test/Unit</directory>-->
        <!--<directory suffix="Test.php">../../../vendor/*/module-*/Test/Unit</directory>-->
        <!--<directory suffix="Test.php">../../../vendor/*/framework/Test/Unit</directory>-->
        <!--<directory suffix="Test.php">../../../vendor/*/framework/*/Test/Unit</directory>-->
        <!--<directory suffix="Test.php">./*/*/Test/Unit</directory>-->
    </testsuite>

    Now PHPUnit can find our test. To configure PhpStorm to be able to run the tests, we go to Run > Edit Configurations in the toolbar. When the popup window appears, we click the plus sign to add a new configuration.

    From the drop-down we choose PHPUnit, and under the Test Runner section we choose Defined in the configuration file, check Use alternative configuration file, and finally select the phpunit.xml file we edited earlier.

    Now PhpStorm knows about our tests and is able to find them. There is just one more configuration step: PhpStorm should also know the PHPUnit library location and the interpreter location. To configure this, go to File > Settings > Languages & Frameworks > PHP > Test Frameworks. Inside this section, add a new configuration using the + symbol.

    Under the assumption that we use the Composer autoloader, we choose the Use Composer autoloader option and select vendor/autoload.php. The Composer autoloader knows the location of all classes, so pointing PhpStorm to autoload.php completes the configuration.

    To run our test, we can click the Run icon next to the configuration options in the upper right corner of PhpStorm.

    If everything is configured correctly, the test should pass.

    Analysis of Magento test

    Here we have the PostDataProcessorTest from the CMS module, and we are going to break it down piece by piece. It covers the validateRequireEntry() method of the PostDataProcessor class. This method validates input by checking whether required fields are empty; if any of them are, the whole request sequence is stopped with an error.

    <?php
    /**
     * Copyright © Magento, Inc. All rights reserved.
     * See COPYING.txt for license details.
     */
    namespace Magento\Cms\Test\Unit\Controller\Page;
     
    use Magento\Cms\Controller\Adminhtml\Page\PostDataProcessor;
    use Magento\Framework\Stdlib\DateTime\Filter\Date;
    use Magento\Framework\Message\ManagerInterface;
    use Magento\Framework\TestFramework\Unit\Helper\ObjectManager;
    use Magento\Framework\View\Model\Layout\Update\ValidatorFactory;
     
    /**
     * Class PostDataProcessorTest
     * @package Magento\Cms\Test\Unit\Controller\Page
     */
    class PostDataProcessorTest extends \PHPUnit\Framework\TestCase
    {
        /**
         * @var Date|\PHPUnit_Framework_MockObject_MockObject
         */
        protected $dateFilterMock;
     
        /**
         * @var ManagerInterface|\PHPUnit_Framework_MockObject_MockObject
         */
        protected $messageManagerMock;
     
        /**
         * @var ValidatorFactory|\PHPUnit_Framework_MockObject_MockObject
         */
        protected $validatorFactoryMock;
     
        /**
         * @var PostDataProcessor
         */
        protected $postDataProcessor;
     
        protected function setUp()
        {
            $this->dateFilterMock = $this->getMockBuilder(Date::class)
                ->disableOriginalConstructor()
                ->getMock();
            $this->messageManagerMock = $this->getMockBuilder(ManagerInterface::class)
                ->getMockForAbstractClass();
            $this->validatorFactoryMock = $this->getMockBuilder(ValidatorFactory::class)
                ->disableOriginalConstructor()
                ->setMethods(['create'])
                ->getMock();
     
            $this->postDataProcessor = (new ObjectManager($this))->getObject(
                PostDataProcessor::class,
                [
                    'dateFilter' => $this->dateFilterMock,
                    'messageManager' => $this->messageManagerMock,
                    'validatorFactory' => $this->validatorFactoryMock
                ]
            );
        }
     
        public function testValidateRequireEntry()
        {
            $postData = [
                'title' => ''
            ];
            $this->messageManagerMock->expects($this->once())
                ->method('addError')
                ->with(__('To apply changes you should fill in hidden required "%1" field', 'Page Title'));
     
            $this->assertFalse($this->postDataProcessor->validateRequireEntry($postData));
        }
    }

    The first thing to notice is the setUp() method; it runs before each test method, and its purpose is to set up the testing environment – instantiate objects and mocks, populate properties, and so on.

    In this example, the setUp() method creates mocks using the MockBuilder and populates test properties with the created mocks.

    There is also the PostDataProcessor instantiation, which makes sense, since testing that class is the whole purpose of this test.

    The PostDataProcessor class requires Date, ManagerInterface and ValidatorFactory dependencies in its constructor, which is why mocks of those classes are created. The mocks are passed to the PostDataProcessor constructor, which doesn’t complain, since a mock is an imitation of the real class.
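    To make the mocking mechanics concrete, here is a minimal, self-contained PHPUnit sketch, independent of Magento (the GreeterInterface and ReportGenerator names are invented for this example): a mock stands in for a real dependency, and we script both its return value and how often we expect it to be called.

```php
<?php
// Hypothetical sketch of PHPUnit mocking; GreeterInterface and
// ReportGenerator are invented for illustration.
interface GreeterInterface
{
    public function greet($name);
}

// The class under test depends on GreeterInterface via its constructor,
// just as PostDataProcessor depends on its constructor arguments.
class ReportGenerator
{
    private $greeter;

    public function __construct(GreeterInterface $greeter)
    {
        $this->greeter = $greeter;
    }

    public function header($name)
    {
        return strtoupper($this->greeter->greet($name));
    }
}

class ReportGeneratorTest extends \PHPUnit\Framework\TestCase
{
    public function testHeaderUppercasesGreeting()
    {
        // Create a mock that imitates GreeterInterface...
        $greeterMock = $this->getMockBuilder(GreeterInterface::class)
            ->getMockForAbstractClass();
        // ...and script its behavior: greet() must be called exactly once
        // with 'John', and will return a canned value.
        $greeterMock->expects($this->once())
            ->method('greet')
            ->with('John')
            ->willReturn('Hello, John');

        $generator = new ReportGenerator($greeterMock);
        $this->assertEquals('HELLO, JOHN', $generator->header('John'));
    }
}
```

    The real dependency never runs; the mock records the call, and the expects($this->once()) expectation fails the test if greet() is called a different number of times.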

    Our test method testValidateRequireEntry() simulates a case in which a parameter array with an empty title is sent for validation.

    There are two assertions:

    • messageManagerMock will be called exactly once, and the method called will be addError(), with the message ‘To apply changes you should fill in hidden required "Page Title" field’. Since this is just a mock and not the real class, the developer has to define the expected method calls and responses.
    • validateRequireEntry() will return false, because it tries to validate a parameter array with an empty required field, which fails validation.

    Both assertions will pass in this case.

    Conclusion


    In this article, we have just scratched the surface of Magento 2 unit testing, and of unit testing in general. The topic is complex and impossible to cover fully in one article, so I encourage you to read more, either in the PHPUnit documentation or in Magento 2 specific testing tutorials. The best Magento 2 testing examples are, naturally, located in the Magento core. Happy testing.

    The post Unit testing in Magento 2 appeared first on Inchoo.
