Genvate

E-Commerce

Google Alert - e-commerce


E-commerce has been one of the fastest-growing industries of the decade, but its business model has always been shaky. Despite witnessing a ...
Posted: July 14, 2020, 5:56 am
The first wave of demand for the e-commerce space relies on the shock impact of consumers finding themselves in lockdown. Shoppers who would ...
Posted: July 14, 2020, 5:48 am
After starting by making masks and donating 100 percent of proceeds to Sheridan Story, the female-owned boutique is officially open for online business ...
Posted: July 14, 2020, 5:37 am
Last week, in its meeting, the Committee had resolved to use the growth and momentum of India's e-commerce sector in order to support the revival of ...
Posted: July 14, 2020, 5:26 am
Then entered Amazon in 1995. The e-commerce mogul not only significantly impacted small businesses but big-box stores as well. Target, Walmart and ...
Posted: July 14, 2020, 5:03 am
Online payment solution companies are creating new partnerships and offering new services amid increases in online shopping. By Alexandra ...
Posted: July 14, 2020, 4:30 am
E-commerce allows people to shop all over the world, 24 hours a day, seven days a week. To make it in an industry that is becoming increasingly ...
Posted: July 14, 2020, 3:56 am
Why? Sharp, insight-rich, in-depth stories across 20+ sectors. Access the exclusive Economic Times stories, Editorial and Expert opinion.
Posted: July 14, 2020, 3:45 am
E-commerce is a term that is also used to describe other online activities, such as internet banking, online auctions and online ticketing. When asking ...
Posted: July 14, 2020, 3:45 am
J/Secure(TM) 2.0, which is JCB's payment authentication programme, will enable safe online shopping from a merchant's website, or mobile application ...
Posted: July 14, 2020, 3:22 am
The demand is likely to be led by ecommerce and ITeS. Job security and health benefits will determine acceptance of offers in the post-Covid-19 era and ...
Posted: July 14, 2020, 3:11 am
The comprehensive study on the Retail E-Commerce Packaging Market provides crucial insights to the stakeholders who are vying to solidify their ...
Posted: July 14, 2020, 1:41 am
The Austin, Texas, based e-commerce company raised over $200 million ... while still providing good amounts of working capital for the business.
Posted: July 13, 2020, 11:14 pm
Businesses are turning to online platforms for various transactions. The Coronavirus Pandemic, though disrupting business activity, has created a huge ...
Posted: July 13, 2020, 11:02 pm
It's understandable that many brands are feeling the urge to rush onto e-commerce platforms, but doing so without a plan could cause more harm than ...
Posted: July 13, 2020, 10:52 pm
To find new revenue, Medialink is planning to venture into e-commerce later this year with a site selling anime-themed clothes, toys and other ...
Posted: July 13, 2020, 10:52 pm
Impressive will help Acer grow their eCommerce platform, as the brand reacts to the significant uptake COVID-19 has had on the technology sector.
Posted: July 13, 2020, 10:02 pm
Pier 1 Imports has accepted a raised offer for its intellectual property, data and online-related assets. Retail Ecommerce Ventures LLC will purchase ...
Posted: July 13, 2020, 8:26 pm
Established in 1918, RTW Retailwinds operates nearly 400 locations in 32 states. But even with a substantial e-commerce business, the company joins ...
Posted: July 13, 2020, 7:30 pm
A New Syndicate Global E-commerce Packaging Market Study is added in HTF MI database compiled covering key business segments and wider ...
Posted: July 13, 2020, 6:40 pm

Blogs

This post is an excerpt from Brian Beck's Billion Dollar B2B Ecommerce: Seize the Opportunity, available now

Channel conflict is largely centered on product price

When a manufacturer or brand sells directly to the end user of the product, they always have more overall margin to work with. Manufacturers have the ability to sell products at a lower price, as they can now capture the “retail” profit margin on the product.

The power to disintermediate is the central underlying factor that defines channel conflict, and must be managed by manufacturers and distributors alike to avoid alienating traditional reseller channels.

Resellers are often responsible for a very large percentage of a manufacturer’s revenues, and this must be respected. In addition, many resellers add real value to the end customer -- thus a balanced approach is necessary.

5 key questions to ask regarding your pricing approach

  • What “retail” or resale price (the price paid by the ultimate user of your products) has the market supported over the past two to three years for your products through each resale channel?
  • What are the pricing trends in each resale channel?
  • Based on pricing history and trends, how much overall margin do you have to work with--your cost versus the ultimate resale price?
  • How does each sales channel price your products? And do they typically offer discounts on your (or their) list pricing? Do they use promotions or similar tools via Ecommerce/direct channels? If you sell thousands of products, you might approach this at a category level instead of product level to make data analysis more manageable.
  • How is the price to the end customer typically determined and managed? E.g. Are there contractual, private pricing relationships that are negotiated, or is the product bought at a published resale price, such as an MSRP from a website or catalog?

5 tactics to apply to direct-to-buyer pricing

Your pricing approach for direct selling efforts via Ecommerce can include the following tactics:

Set a clear Manufacturer’s Minimum Advertised Price (MAP) and enforce it. This is a channel-agnostic and well-documented policy that informs all channels what you regard as the pricing that is acceptable to present publicly. Note that MAP policies can be difficult to enforce legally, and need to be deployed in a channel-agnostic manner, but they are clear lines in the sand that help manufacturers and brands manage selling channels. Of course, you should honor your own MAP policy on your Ecommerce site, at least for any public-facing pricing.

Set public-facing pricing -- i.e., the pricing that anyone looking at your web site sees -- to a level that reflects your other channels' web site pricing. One of my clients does this so that no matter where current or prospective customers look on the Internet, they see the same pricing; the buyer then decides who to purchase from based on other factors, such as service, ease of purchase, and delivery time.

Consider leveraging Ecommerce functionalities to offer special pricing to specific customers (or groups of customers), only displayed behind a web site login (in essence, a private web site). For example, you can use digital tools to show MSRP pricing on the public-facing web site, but once a customer logs into their wholesale account, they are presented with customer-specific pricing levels, perhaps reflective of contracted discounts.

Moreover, if you have different customer segments using the same web site, Ecommerce tools allow you to expose price selectively based on the customer login and account type. In other words, one segment gets to see one price, while a different, unrelated segment is presented with a different price. Some B2B Ecommerce sellers I work with don’t even display pricing online until a customer logs in. The point is that there are a variety of ways to use tools to manage pricing visibility on your web site.
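As a loose illustration (not from the book), the login-gated pricing described above often boils down to resolving a price list from the authenticated account's segment. Everything named in this sketch -- the helpers, the segment values, the field names -- is hypothetical.

// Purely illustrative sketch of selective price display by account segment.
// fetchAccount() and fetchPrice() are hypothetical helpers for your own
// commerce API; field names are made up.
async function getDisplayPrice(sku, session) {
  if (!session || !session.accountId) {
    return fetchPrice(sku, 'msrp'); // anonymous visitors see public MSRP only
  }
  const account = await fetchAccount(session.accountId);
  // e.g. 'wholesale-tier-1', 'distributor', 'retail'
  return fetchPrice(sku, account.segment || 'msrp');
}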

Consider assortment variation as another tactic that can be very effective in managing channel conflict. You might repurpose an existing product line with a different brand name or feature set, or even create a new product line just for direct selling. For example, I work with a well-established manufacturer of a wide variety of garden and home decor products. The company generates over $350 million in revenue per year, and sells its products to numerous major retailers as well as distributors. They recently started selling directly to consumers under a separate brand, but leveraging the same great products. This portion of their business, which is entirely Ecommerce and marketplace driven, has quickly grown to over $20 million in revenue, and continues to grow by 50 percent per year.

Don't forget other tools, such as free shipping, that you can use to influence the sale without impacting the price that is displayed or publicly accessible to your channels. Whatever your approach, you must understand and track pricing by channel on an ongoing basis.

A good plan will recognize and anticipate potential channel conflicts around price and address price fluctuations as you go to market. As you execute an Ecommerce strategy, you may find opportunities to be more aggressive within some customer segments or geographies, particularly in new market segments that are not addressed effectively by your traditional resellers.

The bottom line is that you need to know where and when you can be aggressive, and where and when you can’t.

Elastic Path is proud to sponsor Brian Beck’s Free Virtual Book Tour series featuring a breakdown of key concepts from the book along with real-world success stories from eCommerce leaders at Johnstone Supply, Illumina, Pella, Cardinal Health, and more. Register today

Author: Brian Beck
Posted: June 26, 2020, 7:17 pm

Augmented and virtual reality is gaining traction with brands and retailers looking to enhance buying experiences. And it’s getting easier to adopt and inject into product pages thanks to headless commerce APIs and progressive web applications (PWAs).

Today’s late-model mobile devices are shipping with enhanced native AR support, with toolkits like Apple’s ARKit and Google’s ARCore offering developers a way to connect AR to the storefront. And unlike early iterations of AR for ecommerce that required native app downloads, today’s experiences can be launched through mobile browsers.

While web-based AR still can only be used by shoppers with compatible hardware and operating systems, within one more upgrade cycle (~2 years) we can expect AR accessibility to hit critical mass.

For those who want a head start, adding 3D models and AR to product pages may be simpler than you think.

How Herschel Supply uses AR on product pages

The first step to AR-izing products is to leverage a 3D scanning application such as Qlone, ThreeKit or Herschel’s choice, Vertebrae. Many scanning apps allow you to create models in-house with just a smartphone (All3DP has a great review of popular options).

Because 3D scanning can be a time consuming process, consider starting with your top-selling evergreen products, and choose one master SKU variant for each (e.g. color).

The beauty of capturing 3D images is that, once added to your image gallery, any visitor can view them even without AR -- so it's time well invested! For users with AR-compatible devices, you can display an icon or toggle to switch to AR mode.

You can use device targeting to show AR options only on compatible devices, or display it for everyone. Herschel displays the option for all Web visitors through any browser. Clicking View 3D and AR launches a QR code with instructions on device and OS requirements.

Herschel AR launch with QR code: Herschel puts 3D and AR on product pages

 

Herschel AR product page 2

Users can then “place” the selected item, which appears true to scale, and drag, rotate or view the item from any angle.

Herschel AR 3D image through camera

Check out this experience yourself on Herschel.com

Using ARKit and ARCore (and more)

Developers can use Apple’s ARKit to build experiences for newer iPhones (8 and higher), tapping into their native neural processors and AR features. (Technically, iPhone 7 also supports AR, but with slower, more battery-sucking performance).

Serving AR to iPhones through the Safari browser requires ARKit’s Quick Look extension (iOS 12), which enables 3D models to be viewed in the real-world context through a user’s camera.

How Quick Look works

To play nicely with Quick Look, your 3D-scanned images must be saved with the USDZ file format and hosted through your CDN. AR tags will only display on compatible devices and when a corresponding USDZ file exists and is detected through your CDN (and called to the storefront via page view). This ensures empty tags don’t litter your catalog.

When a user visits a product page with a corresponding USDZ file and activates AR mode, the file will be loaded into the Safari browser and rendered in a full-screen Apple AR viewer app.
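For reference, the markup pattern behind this is small: an anchor with rel="ar" wrapping the product image and pointing at the hosted USDZ file. The sketch below is written as JSX to match the reference storefront code later in this feed; the CDN URLs are placeholders.

// Minimal AR Quick Look link (sketch; CDN paths are placeholders).
// On compatible iOS devices, Safari overlays the AR badge and opens
// the USDZ file in the native Quick Look viewer when tapped.
const ArQuickLookLink = ({ sku }) => (
  <a href={`https://cdn.example.com/ar/${sku}.usdz`} rel="ar">
    <img
      src={`https://cdn.example.com/images/${sku}.jpg`}
      alt={`View ${sku} in augmented reality`}
    />
  </a>
);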

For a more technical description of how to use ARKit for ecommerce, our own Senior Software Engineer Shaun Maharaj documents how our Product team built AR into product pages using Quick Look in our React.js demo store/reference application.

(Elastic Path customers: check out our documentation for ARKit Quick Look integration)

AR of product page using ARKit QuickLook

Google’s Scene Viewer

For Android, Google's Scene Viewer is ARCore's answer to Quick Look. The twist is you can leverage Google's AR wrapper with both Apple's USDZ and Android's GLB 3D file formats with a single line of code. Users of both mobile OSes can launch 3D products in their respective viewers (Quick Look or Android's Model Viewer), or with Magic Leap headsets.
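That "single line of code" is essentially Google's model-viewer web component. A hedged sketch follows (the component's script must be loaded separately, the file URLs are placeholders, and attribute support varies by version):

// Sketch: one element serves a GLB to Scene Viewer / in-page 3D and a
// USDZ to iOS Quick Look. <model-viewer> is Google's web component;
// the URLs below are placeholders.
const ProductModelViewer = ({ sku }) => (
  <model-viewer
    src={`https://cdn.example.com/3d/${sku}.glb`}
    ios-src={`https://cdn.example.com/ar/${sku}.usdz`}
    ar
    camera-controls
    alt={`3D model of ${sku}`}
  />
);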

However, ARCore's framerate and accuracy are reportedly not as good as ARKit's at this time. And while Android today has a larger install base than iOS, it's more fragmented, and not all users are on the latest compatible OS version (Q).

Both Quick Look and Scene Viewer support sharing of AR models through SMS, email, messenger apps and chatbots.

Third-party AR viewers

Third-party tools like Vectary are also available for merchants who want to serve WebAR to both iOS and Android users through a universal viewer. Vectary's AR features are opened in its own viewer via an iFrame you place on your product pages (enabling desktop users to play with 3D images as well).

Vectary also aims to make it easier for designers to add AR experiences without code. Its “CMS for AR” approach pushes changes from its 3D editor directly to your site.

What about WebXR?

WebAR/VR (now WebXR) APIs were conceived in the spirit of democratizing development and accessibility beyond iOS and Android, and are supported by Google Chrome and Firefox (desktop and mobile), Samsung Internet, Servo (desktop) and Microsoft Edge, including HoloLens devices. (Phew!)

The biggest challenge with WebXR is that Apple is on the fence about whether to support it. For that reason, we advise ecommerce developers and designers to focus efforts on ARKit, ARCore or 3P tools that play nicely with iOS.

AR viewer limitations

Quick Look, Scene Viewer and their 3P cousins work best when you want to render a single, fixed model in AR, versus creating custom UI environments or displaying multiple images (products) within the same view.

In other words, they’re:

  • Great for virtual try-on of a single item, more difficult for building an outfit.
  • Great for overlaying an image through a customer’s camera feature, less good for arranging multiple products in a virtual space.
  • Great for seeing a product in context, not so much for swapping colors, finishes or patterns with a single tap.

More advanced AR requires not just 3D imagery, but scaled and textured modeling, plus plane and spatial recognition to display multiple products or to display them in a truly virtual environment. This is important for ecommerce, as less-than-realistic AR and VR can lead to disappointing unboxing experiences -- especially when patterns, finishes and textures are key attributes of a product.

Reality Composer and AFrame

To handle more advanced AR (and even VR), enter ARKit’s Reality Composer, a prototyping tool that allows you to build AR experiences within “virtual space.”

You may have guessed it, but yes -- Apple’s Reality Composer requires USDZ files (and .reality files) and doesn’t extend to Android, though you can build in Reality Composer and serve through a 3P viewer like Vectary to serve both platforms.

Reality Composer is also friendly to non-techies. Amir from Augmentop shares how they pulled price, customer reviews and a cart button into an AR experience in about an hour, with no coding skills.

Augmented reality on a product page with Reality Composer

Alternatively, virtual reality can be built with AFrame (leveraging WebXR), an open source, HTML-based framework. For example, the Elastic Path reference store chose this option to work natively through the Web and PWAs. (Test our sample product in a VR world here - click the “viewer” icon in the top left of the image).

Elastic Path VR Reference Store

Build AR/VR without coding

For those that want to explore AR and VR without technical skills, SaaS platforms and toolkits are available.

Why headless commerce for AR/VR

Accessibility and ease-of-adoption

The first step out of the walled garden of native apps is to choose a headless commerce platform to support a PWA front end. This enables you to leverage web-friendly AR without an app download.

Native apps also don't allow easy navigation between ecommerce pages and features like cart, checkout and browsing history, nor can app-based AR be accessed from any merchandising zone or product page in your catalog. It's possible, but complicated, and the user may not have the appetite to keep switching between the app and the Web.

Choosing to deliver AR/VR through mobile browsers helps ROI. It’s critical that your new AR and VR features are available and easily accessible to as many customers as possible. Reducing cross-platform development and maintenance is another benefit.

Direct access through APIs

One of the biggest advantages of using headless commerce, and microservices (or Packaged Business Capabilities) in particular, is direct access through APIs. This helps you inject product data, prices, attributes, promotions, customer reviews and more into your AR or VR experience.

Because your commerce APIs have all the technology you need and can connect with device and toolkit APIs, there’s not much that needs to be recreated.
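For example (a sketch only; the endpoint path and field names are invented), the same catalog call that powers a product page can hydrate an AR overlay:

// Hypothetical: pull product data from a headless commerce API and
// shape it for an AR overlay. The endpoint and fields are placeholders.
async function hydrateArOverlay(sku) {
  const res = await fetch(`/api/catalog/products/${sku}?include=price,promotions`);
  const product = await res.json();
  return {
    title: product.name,
    price: product.price.display,           // e.g. "$129.00"
    promo: product.promotions?.[0]?.label,  // optional promotion badge
  };
}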

In-AR transactions

Today’s mobile AR viewers don’t natively support checkout and payments -- users must exit the viewer and return to the product page. But with headless commerce and APIs (and some clever development), you can support frictionless AR and VR shopping, keeping your catalog, cart and even payments within one experience.

Support “future commerce”

As toolkits evolve, expect to see more shoppable AR in-store. For example, walking through aisles and seeing prices, promotions, loyalty rewards and endless-aisle capabilities appear through your camera. Or play gamified “Pokemon Go” shopping games anywhere.

Or personalize with cross-sells and upsells appearing as customers browse a physical shop, engage with signage and outdoor advertising, or explore a virtual shopping experience through a headset or other wearable. Connected to CRM, personalization engines, location awareness and even browser history, there are many ways AR and VR will become part of the new “digital-in-store.”

Author: Linda Bustos
Posted: June 23, 2020, 10:56 pm

Prior to COVID-19, analysts were bullish on AR and VR for retail and B2B commerce. Deloitte predicted 100M consumers would use AR to shop online or in-store in 2020, while Gartner claimed 46% of retailers were planning to deploy AR or VR. IDC forecast retailers and manufacturers would drop $1.5B and $1.4B respectively on AR/VR experiences.

Consumers are quite receptive to AR and VR shopping (at least the surveys say) -- preferred over voice and social shopping by a landslide. Retail Perceptions found:

  • 1 in 3 shoppers already uses AR
  • 71% of consumers would shop at a store more often if they offer AR
  • 47% would prefer to use AR both in store and online
  • 40% would be willing to pay more to brands that offer AR

Is AR/VR still viable in 2020?

Despite the apparent demand from business and consumers, COVID-19 has undoubtedly thrown a monkey wrench into AR and VR investment. Even in pre-pandemic conditions, 64% of executives said lack of budget was a roadblock to adopting the technology, with lack of internal resources (55%) and executive buy-in (42%) also factors.

In our current climate, “fail fast” experimental projects are a luxury many brands and retailers can’t afford.

Any investment in AR or VR in 2020-2022 requires a rock-solid business case and anticipated ROI in a time when even conventional commerce projects are put on hold or canceled.

For AR/VR projects that are greenlit, fast time-to-market and efficient use of resources is key.

The good news for brands and retailers is AR and VR no longer need greenfield investment in bespoke agency development or fancy hardware. Both iOS’ ARKit and Android’s ARCore support AR capabilities that can be baked into native and progressive web apps, and WebAR / WebVR supports cross-device experiences.

To evaluate whether investment is worth the risk, let’s explore the ways AR and VR can mitigate some of the pain brought on by the pandemic, and the available solutions.

3 ways AR/VR can help brands and merchants thrive post-coronavirus

Replace the photoshoot

New social distancing protocols, restricted travel and leaner budgets are all disrupting the traditional photoshoot.

No worries for brands and retailers, there are at least two AI and AR solutions (Zeekit and VueModel) that can close this gap. ASOS and Milaner are just two retailers using the technology, and the results are pretty mad decent.

ASOS AR AI model photoshoot

With Vue.ai’s VueModel, for example, you can use your own models or choose from Vue.ai’s roster. The tech allows you to superimpose garments on your model, choose different poses and backgrounds, and customize ethnicities and body types.

You can also upload flat product photos and VueModel’s AI will predict the size and fit and choose a model for you. The company claims its tool can shave ¾ off the cost of traditional photo shoots.

Considering live photoshoots have long been manipulated by angles, lighting, garment pinning and Photoshop, AR combined with predictive AI can offer customers a more realistic vision of a garment to boot.

The advantages of AR photoshoots go beyond staying safe during COVID-19:

💡The ability to offer model diversity and allow customers to filter by model attributes is a value-add for any merchant, giving customers an enhanced way to personalize their experience. #inclusivity

💡Images go beyond just product photos, complete looks can be styled for product pages, banner images, social content, ads, lookbooks and “shop the look” content quickly, affordably and at scale.

💡Virtual styling at scale enables greater merchandising personalization. Fashion preferences in LA, the Midwest, New York and Miami can be wildly different, for example, as can preferences by age. The ability to serve different banners to different customers based on browser and profile data can tailor relevant experiences.

💡Got a product with a low click-through or sell-through rate? The problem could be the model, from their pose to their facial expression or another variable. The ability to instantly re-shoot or even A/B test models can have a serious impact on sell-through.

Enhance try-before-you-buy

Virtual try-on is hardly a new feature. We covered one of the earliest adopters, EyeBuyDirect's Wall of Frame, back in 2008.

Today there are few eyewear brands and retailers that don’t offer a similar tool. And virtual try-on has infiltrated everything from cosmetics to apparel, home furnishings to footwear -- evolving to include 360° images, 3D mesh models and interactivity for more realistic visualization.

For example, Quytech offers merchants a tool that enables customers to 3D scan their feet, body or face to virtually try on products. Competitor Vyking.io allows customers to move, even walk around in their augmented kicks.

Virtual try-on was once an online-only experience (or delivered through expensive “magic mirrors” or Oculus experiences in-store). But today’s need for low-touch, hands-off and hygienic physical retail has increased demand for self-serve AR/VR through personal mobile devices.

Today, many shoppers carry compatible (late model) devices that now support AR and VR, and can access these features through native or progressive web apps. But the challenge remains that many retailers are serving experiences that are device and even browser specific -- even through PWAs.

For example, Kendra Scott's jewelry try-on is built with Apple’s ARKit. Though served through a PWA, the feature only works in the US for newer iPhone models through the Safari browser.

Kendra Scott ARKit AR try before you buy jewelry

Are buyers ready for virtual try-on?

💡With online returns amounting to $350-$550B per year, virtual try-on pays for itself if it can increase conversion while reducing returns for “sight unseen” purchases.

💡However, it’s better to use no AR than poor AR. Virtual try-on can actually increase returns if it’s misleading or misrepresents fit or form.

💡Today, ARKit only works with late-model iPhones (7 and higher), and Android’s ARCore has similar constraints. We’re still an upgrade cycle away from critical mass adoption of AR and VR-compatible mobile devices, meaning it could be another 2 years before experiences become universally accessible.

Support social distancing in high-touch sales

Virtual experiences also have an impact on B2B, including industrial manufacturing. Supporting remote demonstrations, trials and training for specialized equipment replaces the on-site sales or technician visit, reducing costs and increasing efficiency.

PTC's Vuforia Chalk is an augmented reality tool that enables on-site and off-site employees and brand reps to collaborate on demos, training, operations and repair. They liken the tool to "FaceTime with augmented reality superpowers." (To help brands and manufacturers through the COVID-19 crisis, PTC is offering Chalk free.)

For example, Howden uses mixed reality (both AR and VR) to create step-by-step instructions from existing 3D machine models.

Howden mixed reality experience for B2B

Virtual experiences can also be translated to website content. B2B brands and manufacturers who both sell on and compete with Amazon are often hamstrung by the requirement to offer content parity across channels (in other words, Amazon won't let you have better content on your own site than the marketplace).

The ability to add AR/VR content such as demos and tutorials on the branded website provides a value add for new and existing customers that Amazon (and many competitors) can’t match.

These are just 3 ways you can use AR and VR to enhance digital in the “new normal” of social distancing, increased hygiene concerns and digital pre-shopping.

Next post, we’ll dig even deeper on how to use AR and VR with headless commerce. Are you subscribed?

Author: Linda Bustos
Posted: June 10, 2020, 9:56 pm

Pre-COVID, 40% of shoppers said buy online, pick up in store (BOPIS or click-and-collect) was their most valued retail shopping experience, with BOPIS behavior growing 30% from 2018.

Today, curbside pickup is the BOPIS delivery model of choice (dubbed BOPAC - or "buy online pick up at curb") as we navigate this “new normal.” Both consumers and merchants have adapted fast, with curbside pickup doubling year over year and 59% of consumers polled in April saying they're likely to continue choosing curbside pickup due to the pandemic.

Among retailers not offering curbside pickup, one third say they're scrambling to offer the service as quickly as possible.

But what was designed for convenience can be anything but in today’s chaotic climate of social distancing, limited store hours and reduced staffing. Many retailers require customers to call from the parking lot (and battle the busy signal), endure lengthy in-car wait times or even queue to get into the store to flag down an associate.

What’s more, many new-to-curbside merchants are still working out the kinks in their systems and processes, and lack the back-end management that helps connect customer service with curbsiders. 

6 ways to close the curbside pickup experience gap

Support SMS

Emails can get lost, buried and, worse, go unread -- while 90% of texts are read within 3 minutes of receipt. Thanks to their visibility, convenience and speed, more than 50% of consumers prefer text communications with businesses over telephone (and supporting SMS also keeps your phone lines free for those who prefer voice).

But the best text experiences aren’t one-sided. Supporting two-way texting between customers and staffers provides the optimal experience -- even if it’s an AI-driven chatbot pushing back canned responses (40% of shoppers don’t care if it’s a bot or a human texting, and 27% can’t tell the difference).

Enhance pickup logistics

With social distancing policies still in place and lineups to get into most stores, giving customers the ability to minimize curbside lineup waits (or stay in their cars!) helps speed the pickup process. SMS messaging enables customers to notify staffers when they’ve arrived, and even remain in their cars by sending details on their location and car make and model.

For example, Object Edge’s Curbspot curbside solution provides an interface to enter this info:

curbspot order is ready push notification

Consider BOPIS-bots and GPS

To truly Uber-ify curbside pickup, consider integrating BOPIS with chatbots and GPS in your progressive web application (PWA). Customers can initiate their “trip” as they leave their home through your web app, and pickups can be right-timed with their arrival and location on the curb or in your lot.

Personalize post-purchase

Without the in-store visit, curbside orders lose the opportunity to drive additional in-store purchases -- 85% of BOPIS shoppers say they’ve bought additional items in-store during a pickup trip.

To pick up this slack, consider pushing personalized cross-sells and upsells post-online purchase to “add to” an order based on popular in-stock items at the pickup location, a customer’s local purchase history or other add-ons. In this post-purchase flow, remind customers that making fewer trips for more items is most convenient during times of social distancing, reduced store hours and more frequent stockouts.

Promote post-pickup

Another way to boost the basket value of each curbside pickup is to offer incentives to shop again. For example, Object Edge’s Curbspot supports post-pickup coupons and deals for future purchases (combined with an opportunity to send feedback on their experience).

curbside pickup post purchase order

Support store associates

Don't forget the user experience on your back end! Giving staff an intuitive, easy-to-onboard way to manage curbside orders and coordinate with customers (especially when customers can't be found or are no-shows) is key to a seamless curbside experience and customer satisfaction.

curbside pickup back end interface

If you’re using a plug-and-play accelerator such as Curbspot, make sure its back end is user-friendly. If you’re building a custom curbside pickup solution, remember the features you want to provide your staff: view all pickup orders, view customer-ready orders, view orders out for pickup (or to pickup within a certain time frame), and completed pickups.

How headless commerce supports curbside pickup

API-driven headless commerce allows you to create new experiences -- adding curbside pickup flows to cart and checkout, connecting inventory and order management to chatbots and SMS messengers, and plugging these pieces into progressive web applications or customized UIs for back-end users and store associates -- all without adding code to your platform itself.
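As a rough sketch of the SMS piece (the order webhook, environment variables and message copy are assumptions, and Twilio is just one possible provider), an order-status event could trigger a curbside-ready text like this:

// Hypothetical handler: when an order becomes ready for pickup, text the
// customer. Uses the Twilio Node SDK as one example SMS provider.
const twilio = require('twilio');
const client = twilio(process.env.TWILIO_SID, process.env.TWILIO_TOKEN);

async function onOrderStatusChanged(order) {
  if (order.status !== 'ready-for-pickup') return;
  await client.messages.create({
    to: order.customerPhone,
    from: process.env.TWILIO_FROM_NUMBER,
    body: `Your order ${order.number} is ready! Reply with your parking spot and car description when you arrive.`,
  });
}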

If you already have a BOPIS solution and want to adopt enhanced features now, you can add Curbspot to any ecommerce platform in about an hour with no coding and no cost for 30 days. Check out GetCurbspot.com for more details

For more articles on how headless commerce supports digital transformation, check out our headless commerce archive.

Author: Linda Bustos
Posted: May 26, 2020, 4:48 pm

In the “new normal” of disrupted supply chains and local stores doubling as online fulfillment centers, accurate real-time data is even more critical, as is the ability to reserve stock. Shoppers have disappointing out-of-stock experiences once every three shopping trips, on average.

Traditional endless aisle helps in-store shoppers locate out-of-stock items to order online or have shipped to store, and was initially offered through stationary kiosks and tablet-toting sales associates (Endless aisle 1.0).

For retailers that have integrated omnichannel ecommerce, customers can use the online store as an endless aisle tool through their personal mobile devices (Endless aisle 2.0). A 2019 consumer survey by iVend found 93% of consumers already use their mobile devices to pre-shop before visiting a store.

Post-COVID, using a personal device is an even stronger use case. Cautious consumers are touchy about touchscreens, and huddling over a tablet is too close for comfort.

What's more, the in-store shopping experience (for stores that are open) has become more painful. Lineups to get into the store and the risk of finding items out of stock make the routine shopping trip a more considered activity.

Providing shoppers the ability to pre-shop at home or on a mobile device with real-time inventory availability by store location, reserve store stock (or source from nearby stores) and pick up orders safely and swiftly is the highest value investment you can make to your customer experience during this pandemic (Endless aisle 3.0).

Enhancing mobile endless aisle with PWAs

One of the most popular reasons to adopt headless commerce is to support progressive web applications. PWAs allow you to incorporate native smartphone utilities like the camera, GPS and push notifications with your ecommerce and omnichannel experience.

For example, a PWA can:

  • Help mobile users locate product pages faster by using their camera to scan barcodes, QR codes or even use image recognition
  • Provide turn-by-turn directions in-store to locate products on a digital cart or shopping list on the shelf (wayfinding)
  • Help customers locate in-stock products closest to their current location
  • Send push notifications when a customer requests a back-in-stock alert or BOPIS order is ready for pickup
  • Send push codes to unlock pickup lockers (or integrate with authentication apps like Okta)

Unlike native apps that must be developed for different operating systems and require user download, PWAs are supported across mobile browsers. PWAs also work offline (so long as your customer has connected to it online before), making in-store mobile use even easier and more accessible.
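For instance, the back-in-stock and order-ready alerts listed above ride on the standard Push API. A minimal subscription sketch (the VAPID key handling and the subscription endpoint are placeholders):

// Sketch: subscribe a PWA user to push alerts (back in stock, BOPIS ready).
// Assumes a registered service worker; urlBase64ToUint8Array() is an
// assumed helper that decodes your server's public VAPID key.
async function enablePickupAlerts(vapidPublicKey) {
  const registration = await navigator.serviceWorker.ready;
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true,
    applicationServerKey: urlBase64ToUint8Array(vapidPublicKey),
  });
  // Hand the subscription to your own backend (placeholder endpoint).
  await fetch('/api/notifications/subscribe', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(subscription),
  });
}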

Why headless commerce for mobile endless aisle?

To optimize your PWA for these emerging and urgent use cases, you want the ability to compose applications quickly, using only the relevant commerce services you need for a given touchpoint and the logic required to support the context of the experience.

“Going headless” (decoupling your front end from your back end commerce platform) is the first step towards composable commerce. But to truly launch new experiences that connect core commerce to new touchpoints (“heads”) like PWAs, chatbots/messengers, wearables and Internet of Things, you want modular services that can be remixed fast without hardwiring or adding code to your back end system.

The importance of real-time inventory visibility

You also want your touchpoints to connect with real-time data -- especially if you use local store inventory for online fulfillment which has a double-impact on in-store stock availability.

And it’s not just your own inventory you may want to connect to. If you supplement ship-to-store and endless aisle with drop-ship suppliers, you need real-time visibility and integration with their stock status and pricing. You may even integrate with shipping carriers and rate shopping tools to determine if ship-to-store and ship-to-customer orders can be delivered on time and at a reasonable cost.
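A simplified sketch of that aggregation (the supplier endpoint and response shapes are invented for illustration) might query your own stock and a drop-ship feed in parallel:

// Hypothetical: combine own-warehouse stock with a drop-ship supplier feed.
async function getAvailability(sku, storeId) {
  const [own, dropShip] = await Promise.all([
    fetch(`/api/inventory/${sku}?store=${storeId}`).then(r => r.json()),
    fetch(`https://supplier.example.com/stock/${sku}`).then(r => r.json()),
  ]);
  return {
    inStore: own.onHand,                                     // units at this location
    shipToStore: dropShip.available ? dropShip.quantity : 0, // supplier stock
    leadTimeDays: dropShip.leadTimeDays ?? null,
  };
}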

Real-time integration with the right systems (and the ability to add or remove data sources as needed) is a key benefit of headless and API-driven commerce. Want to learn more? Get in touch.

Author: Linda Bustos
Posted: May 16, 2020, 7:27 pm

Despite the buzz around AI’s potential to make ecommerce as personal and natural as an in-store visit, fully-baked intelligent shopping assistance remains several years away.

In the interim, online retailers are embracing chatbots as a first toe-dip into conversational commerce, using them to expedite customer service and provide novel ways to discover products.

According to LivePerson's analysis of over 20 years of live chat logs, 70% of ecommerce chat inquiries can easily be handled by automation. While typical live chat is offline after business hours, chatbots are available 24/7 and reply instantly, unlike human agents who may be juggling several chat threads at a time.

Today, with many call centers closed or running minimal hours due to the coronavirus crisis, live help chatbots are even more valuable to ensure customers are cared for as quickly as possible.

Online shoppers value speed-to-resolution

Fifty-six percent of online shoppers say they prefer to resolve issues through messaging apps rather than email or call customer service, and 40% say they don't care whether a chatbot or a person answers their customer service questions.

Chatbots don’t have to have all the answers. Pre-built dialogs handle routine inquiries, with the ability to hand off to a live person or help desk when a bot is stumped. Chatbots can also collect names, email, order numbers, account preferences and marketing opt-ins and pass this data back to different databases and tools.

How online retailers are leveraging chatbots

Messenger merchandising

Beyond customer service, messenger apps like Facebook and KiK are used by a number of retailers including Lego, Sephora, ASOS, Marks and Spencer and H&M. With 67% of consumers open to shopping through a chatbot, these interactive shopping tools provide a novel way to explore online catalogs, engage with content and get personalized recommendations in a conversational context.

Messenger chatbots offer an alternative to the traditional “search-and-browse” ecommerce experience. Most support voice (through speech-to-text) and image recognition via a mobile device’s camera, such as ASOS’ Enki assistant (below). Advanced messenger chatbots support natural language processing (NLP) to infer intent and make intelligent recommendations.

ASOS enki chatbot

Image Credit: Econsultancy

Such chatbots are typically deployed with “builder” platforms like Chatfuel or ManyChat, with varying features and capabilities. Some chatbot builders support in-chat payments through Stripe integration. Others integrate with popular email and review platforms, store locators, CRM and help desks through tools like Zapier. (Check out our review of 14 chatbot solutions for ecommerce).

For remarketing, Facebook Messenger bots support automated sequencing campaigns to send promotional offers, personalized alerts and cart recovery messages through Facebook Ads. User “tagging” can be used for analytics, segmentation and remarketing strategies.

Messenger chatbots versus native assistants

American Eagle Outfitters is another retailer embracing messenger shop-bots. Through AEO’s Gift Bot, KiK and Facebook Messenger users can take a short quiz to discover gift suggestions served by the bot. Shoppers can also upload photos of clothing and shoes that fit their style and explore visually similar items.

American Eagle Facebook Messenger Chatbot

Image credit: Chatbot Guide

While both American Eagle Outfitters and its sister brand Aerie are accessible through one website and a unified cart, AEO chose to create different bots for each brand, individually deployed to both KiK and Facebook Messenger platforms.

American Eagle Outfitters chatbot

“(We) wanted to have separate chatbot experiences for each brand,” says AEO’s Chief Technology Officer, Colin Bodell, “because not every customer shops both brands or would find information from both retailers relevant.”

While this strategy keeps bots tightly focused, it also creates siloed experiences within each brand and platform. In addition, the shopping experience is restricted to what’s possible within a chat dialog. Category browse, site search, product content, reviews and shopping carts are inaccessible until a user clicks out of the messenger app and into the e-store (where all chat context is lost).

Why build a native chatbot?

While messenger chatbots are popular and integrate with platforms your customers are already using, you’re limited to the features and functionality provided by the messenger platform (which are all equally available to your competitors). Your chatbot strategy may call for support for key use cases, such as:

Keeping conversational commerce in context

Integrated with and accessible from your online store, native chatbots preserve context while keeping your website’s rich features at shoppers’ fingertips. For American Eagle Outfitters, a native bot would allow shoppers to explore both brands as seamlessly as they can on the website, or scope to their preferred brand or department. All it takes is asking bot users their preferences early in the dialog.

Unlike platform-specific bots, site-native chatbots are accessible to all site visitors and are more easily discovered through the website than chatbot directories within each messenger app.

Offering enriched capabilities and personalization

Native chatbots also enable you to integrate commerce features beyond what messenger bots support. For example, APIs can pull data from personalization and promotions engines, customer accounts, loyalty programs and order management. You can also build chat experiences that support bundles and carts, connect with a wishlist or support mobile wallets and one-touch payments.

Predictive analytics can be used to trigger proactive chat based on browsing behaviors such as slow scrolling or pauses on a product list page, excessive “pogo-sticking” navigation between pages or frequent repeat visits without a conversion. Proactive chat may also trigger from certain cart conditions like “over X items in cart,” or “over $X in cart.”
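A crude client-side version of those triggers (the thresholds and the openProactiveChat() hook are assumptions, not recommendations):

// Sketch: open a proactive chat prompt on simple cart or idle signals.
function watchForChatTriggers(cart, openProactiveChat) {
  // Cart-condition triggers, e.g. "over X items in cart" or "over $X in cart"
  if (cart.itemCount > 5 || cart.subtotal > 200) {
    openProactiveChat('Need a hand finishing your order?');
    return;
  }
  // Idle trigger: a long pause on the page (e.g. a product list page)
  let idleTimer;
  const armIdleTimer = () => {
    clearTimeout(idleTimer);
    idleTimer = setTimeout(
      () => openProactiveChat('Looking for something specific?'),
      45 * 1000
    );
  };
  ['scroll', 'click', 'keydown'].forEach((evt) =>
    window.addEventListener(evt, armIdleTimer)
  );
  armIdleTimer();
}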

Supporting omnichannel and endless-aisle

APIs can sync chatbots with microservices such as inventory management for real-time endless-aisle. Instead of struggling through a mobile website’s search and category menus, an in-store shopper could scan a barcode or upload a picture of any product to locate additional sizes, colorways or quantities through chat. Out-of-store shoppers can use the same feature to reserve-in-store. GPS and maps can “send location” and provide turn-by-turn directions to local stores.

Protecting sensitive information and supporting secure in-chat transactions

To facilitate transactions and handle customer support inquiries with sensitive data such as passwords, chatbots require encryption and other privacy controls to comply with PCI, GDPR and PIPEDA.

While some third-party chatbot builder platforms such as Ada meet PCI DSS standards, you’re limited to the payment processor(s) supported by the platform (namely Stripe). API-driven, headless commerce and microservices give you the flexibility to natively support in-chat payments that work seamlessly with your existing digital platform and processors.

Taking conversational commerce beyond the bot

APIs and microservices extend conversational commerce beyond chatbots to the Internet of Things, in-store digital installations, smart speakers, wearable devices and more. These new “heads” can access any commerce service required, with business logic tuned to what makes most sense for every device -- even those which are fully voice-enabled and lack a screen or graphic user interface.

How headless commerce supports native chatbots

Consider your chatbot as a new touchpoint with its own business logic and interface. You want to pull data from certain commerce services (such as catalog, personalization engines, search, accounts, site content like FAQs and Shipping policies, order management, pricing, promotions or even cart and checkout) -- but use this data in a way that suits the conversational format.

If your ecommerce platform is headless, your chatbot can connect to any or all of these services through a robust API, with business logic configured and tailored to the chatbot within the API layer without changing code in the commerce platform itself. This ensures that adding a chatbot won’t interfere with your online store experience and won’t require complicated development.

If you’re using headless commerce and microservices architecture, the services within your ecommerce platform are independent of each other. This adds an additional benefit, as you can compose a chatbot experience even quicker, connecting to only the services you want (versus the entire back end if you’re using a monolithic headless platform).
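In practice, that composition can be as small as a webhook that touches only the services the bot needs -- search and inventory, say. The sketch below assumes hypothetical service URLs and a Node 18+ runtime with global fetch.

// Sketch: a native chatbot backend composed from just two commerce services.
const express = require('express');
const app = express();
app.use(express.json());

app.post('/chatbot/find-product', async (req, res) => {
  const { query, storeId } = req.body;
  // Call only what this "head" needs: search, then inventory (placeholder URLs).
  const results = await fetch(
    `https://commerce.example.com/search?q=${encodeURIComponent(query)}`
  ).then((r) => r.json());
  const top = results.items[0];
  const stock = await fetch(
    `https://commerce.example.com/inventory/${top.sku}?store=${storeId}`
  ).then((r) => r.json());
  res.json({
    reply: stock.onHand > 0
      ? `${top.name} is in stock at your store for ${top.price.display}.`
      : `${top.name} is out of stock locally, but I can ship it to your store.`,
  });
});

app.listen(3000);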

To learn more about how headless commerce and microservices support native chatbots, get in touch or check out our documentation and chatbot reference experience.

Author: Linda Bustos
Posted: May 14, 2020, 9:41 pm

We've all heard online shoppers will abandon a page that doesn’t load in “X seconds or less.” But in-store shoppers get antsy too -- physical retailers lose nearly $40 billion in potential sales each year from patrons who won’t wait longer than five minutes in a checkout line.

Note that this statistic dates from before the COVID-19 outbreak. For the stores allowed to remain open during this pandemic, queues have only become more painful, with customers facing lineups to get in and to check out.

While consumers are arguably more tolerant of slow lines under present conditions, physical distancing poses more checkout challenges. Many merchants refuse to handle reusable bags or physically touch cash and customer credit cards, interacting with customers from behind a plastic shield. Some self-checkout stations are closed off to preserve the 6-foot spread, and touchscreens pose their own threat.

Mobile self-checkout: a promising solution

With 63% of consumers already using their mobile devices to find product information and reviews in-store, supporting mobile self-checkout is a natural extension of the in-store digital experience. Pre-pandemic, 40% of consumers said they were more likely to shop at a store offering mobile self-checkout, and 63% of retailers were planning to support the ability to use a customer-owned mobile device as a point-of-sale checkout by 2022.

We can expect current behaviors to remain even once "non-essential" brick and mortar shops reopen. Conditioned by weeks to months of "new normal" retail experiences, consumers will be more conscious of social distancing and protective measures. We anticipate an even higher percentage of shoppers and retailers to embrace the concept of self-checkout through a personal device.

Mobile self-checkout is already here - with a problem...

Early adopters like Walmart, U-Haul, IKEA, 7-11 and Macy's have already launched scan-and-go applications. The problem is they're native applications, requiring shoppers to download an app (friction) on iOS or Android (exclusion).

Eliminating self-checkout friction with headless commerce and PWAs

For a self-checkout solution to gain traction and make an impact on wait times and safety, it should be universally accessible to all customers with a smartphone. Building the feature into a progressive web app (PWA) enables access through any browser (and simplifies development and maintenance to boot).

Stance Socks is one of the first brands to leverage PWAs for mobile self-checkout.

How Stance uses microservices for mobile self-checkout

With Stance's mobile self-checkout solution, shoppers scan barcodes with their cameras to add items to a digital shopping bag as they fill a physical bag (colored differently from regular checkout bagging).

Integrating seamlessly with Stance's in-store POS, Stripe payments stack and digital commerce engine, the mobile self-checkout API pulls product data, pricing and payment services into the web app to create the cart, update contents and process payment.
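Conceptually (this is a sketch, not Stance's actual code; the endpoints are placeholders), the scan-to-bag step boils down to resolving the barcode to a SKU and calling the cart service:

// Hypothetical scan-to-cart flow for mobile self-checkout. The barcode
// value comes from the device camera (scanner library not shown).
async function addScannedItemToBag(barcode, cartId) {
  // Resolve the barcode to catalog data (placeholder endpoint).
  const product = await fetch(`/api/catalog/barcode/${barcode}`).then((r) => r.json());
  // Add it to the shopper's digital bag via the cart service.
  await fetch(`/api/carts/${cartId}/items`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ sku: product.sku, quantity: 1 }),
  });
  return product; // echo name and price back to the web app as confirmation
}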

Customers can pay by credit card, Apple Pay or GPay. Says Stance’s EVP of Direct-to-Consumer, Paul Zaengle:

“ApplePay and GPay are the fastest for consumers – they can easily get through a 2-3 item purchase in under a minute. A credit card takes slightly longer, as it adds 21 keystrokes, but is still pretty quick using an integration with Stripe.”

On a shopper's way out, an associate quickly scans a digital receipt displayed on the phone screen to match it to bagged goods. Each digital purchase receipt is unique and time-and-date stamped with a color-changing bar tracking the number of items to discourage theft.

Ringing in results

Zaengle reports mobile self-checkout accounts for 15% of sales daily "with minimal in-store signage and marketing so far," and that customers who use it once typically use it again. "I know this number will increase as we get into (the) holiday season -- as soon as there is a line, adoption will go up, as customers begin to see the utility and convenience of it."

Stance’s mobile self-checkout also boasts a 91% email capture rate -- more than double the capture through its retail POS.

“I’m bullish on self-checkout. It combines the richness of a physical retail store visit with the convenience of eCommerce; it provides a sweet spot to give our guests the best experience possible. As more retailers implement solutions like this, it will become a consumer expectation.”

Rapid rollout

With microservices, extending mobile self-checkout to the physical store didn’t require a full rip-and-replace platforming project, nor did it require an overhaul of Stance’s retail POS systems. Using only a small team of developers, mobile self-checkout went live in less than seven weeks in its pilot store, with subsequent locations deploying in under two weeks per store.

Tying the threads

Data collected through mobile self-checkout connects with personalization and subscription services. This enables cross-channel behavior and purchase trends to be applied to recommendations segmented to a visitor’s geolocation, including affinities to college and sports teams, artists, influencers, designs, styles and more.

Although today many enterprise ecommerce platforms have gone headless, most stop short at decoupling the front end from the back end commerce platform. While such decoupled architecture supports CMS and DXP-led commerce, SPAs and PWAs, achieving true commerce anywhere requires modular, portable capabilities.

Rapidly roll out mobile self-checkout with microservices

Mobile self-checkout is one way to adapt the in-store experience to a mobile-first and post-pandemic world. And it's possible to get a "pocket point-of-sale" solution up and running in advance of stores reopening.

Regardless of your legacy platform, you can be self-checkout ready in as little as two weeks with Elastic Path's mobile self-checkout reference experience.

Author: Linda Bustos
Posted: May 14, 2020, 7:55 pm

This year at WWDC 2019, Apple announced ARKit 3 and RealityKit, which bring additional capabilities to the already growing augmented reality (AR) framework. Last year at WWDC 2018, Apple's ARKit 2 introduced Quick Look features over the web for iOS 12, which allows 3D rendered models to be viewed and interacted with in the real world directly through Safari on iOS. This is achieved by making use of the native platform technologies for supported browsers -- think PWA features, making use of available browser APIs. As of this year, Google has recently announced similar ARCore features to be made available over the web in Android Q, which is currently pending public release. Having already implemented ARKit over the web using ARKit 2, I felt it's time to get the word out there on the true capabilities of these features. Now, without further ado, we'll dive into the details on how we incorporated ARKit's Quick Look features into our PWA.

Reference Experience

As of October 25, 2018, the React PWA (Progressive Web App) Reference Storefront incorporates ARKit 2's Quick Look functionality over the web, allowing shoppers to view 3D models of the products they browse in the storefront.

Let’s venture into how we achieved our ARKit implementation through our Reference Storefront PWA. Our intention was to give shoppers of our storefront the option of an AR experience of a product when they visit a product’s display page that supports the functionality.

Firstly, creating augmented reality content requires knowledge of 3D modelling, or at least a tool to help with the creation of models. Models can be created from images, from scratch, or from templates available on the web. We used Vectary (https://vectary.com) for our implementation, which is a subscription-based 3D modelling web application; however, you can use any modelling application of your choice.

Second, ARKit mandates usdz as the file format for rendering 3D models. This was easy, since Vectary can export our models to a variety of file types (.obj, .usdz, etc.). If you’re not using Vectary, Apple has created a tool to help with this generation/conversion here: https://developer.apple.com/augmented-reality/quick-look/.

Lastly, we had to host our usdz files somewhere so they can be picked up by our Reference Storefront application. ARKit usdz files are externalized through content URLs within the storefront application configuration. The default URLs are configured to reference the usdz files, which are located on Amazon S3; however, any other CMS (Content Management System) provider may be used. Note that depending on the content hosting system you're using to deliver your usdz files, you may need to configure CORS to allow HEAD requests.

We added some configuration properties for our project to use:

"arKit": { "enable": true, "skuArImagesUrl": "https://s3.amazonaws.com/referenceexp/ar/%sku%.usdz" },

 

  • arKit.enable: Enable ARKit's Quick Look capability to load on a product display page.
  • arKit.skuArImagesUrl: The path to the usdz files hosted on an external CMS. Set this parameter to the complete URL of the files by replacing the sku/file-name parameter with %sku%. This parameter is populated when the page is loaded with values retrieved by Cortex.

We were able to create all the necessary implementations around this strictly within our React component responsible for loading the product information on the product details page (productdisplayitem.main.jsx).

To start, we have to determine if a corresponding usdz file for our product is hosted on our configured content delivery provider. We'll add an initial check to see whether the "arKit" configuration has been set and whether the ar tag is supported by our browser.

After validating that ARKit can be supported on the current browser, we then check for the corresponding usdz file by initiating a HEAD request to the usdz file location, and update our component's state to indicate whether or not the file exists. We execute this whenever our component mounts, or when it receives new properties. We wrap our request in a function urlExists() to simplify our calls.

urlExists(url, callback) {
  this.funcName = 'UrlExists';
  fetch(url, { method: 'HEAD' }).then((res) => {
    callback(res.ok);
  });
}

...

if (Config.arKit.enable && document.createElement('a').relList.supports('ar')) {
  this.urlExists(Config.arKit.skuArImagesUrl.replace('%sku%', res._code[0].code), (exists) => {
    this.setState({
      productData: res,
      arFileExists: exists,
    });
  });
}

...

Next, we'll need to render our product image to display the usdz file we fetched (if the file exists). Upon rendering our component we'll wrap our product image in an anchor tag with rel="ar" if the usdz file exists. This will position the ARKit Quick Look icon directly around our product image, so that tapping it will direct the shopper into Quick Look with our 3D object file.

if (arFileExists) {
  return (
    // rel="ar" tells iOS Safari to open the linked usdz file in AR Quick Look.
    <a href={Config.arKit.skuArImagesUrl.replace('%sku%', productData._code[0].code)} rel="ar">
      <img
        src={Config.skuImagesUrl.replace('%sku%', productData._code[0].code)}
        onError={(e) => { e.target.src = imgPlaceholder; }}
        alt={intl.get('none-available')}
        className="itemdetail-main-img"
      />
    </a>
  );
}

When the product image is rendered in the storefront, usdz files are retrieved on a per-SKU basis, as available from the CMS provider. The storefront only renders the AR anchor tag if the file exists; any SKU without a corresponding usdz file will not have an AR tag displayed on its product display page.
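For completeness, here is a minimal sketch of that fallback branch. The actual component may wrap the image in additional markup, so treat this as illustrative only:

// Fallback sketch (illustrative): when no usdz file exists for the SKU, render the
// product image without the AR anchor.
return (
  <img
    src={Config.skuImagesUrl.replace('%sku%', productData._code[0].code)}
    onError={(e) => { e.target.src = imgPlaceholder; }}
    alt={intl.get('none-available')}
    className="itemdetail-main-img"
  />
);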

 

A quick demonstration.

Summary

 

We chose to implement augmented reality in our PWA using this solution because it was (and presently still is) the most viable approach for general practice. The implementation relies solely on features available at the browser level, with no need to integrate custom libraries or dependencies. Conforming to this browser-level pattern also means that future features delivered through it can be adopted with little rework. Google has followed a similar pattern by bringing its own ARCore technologies to the web, which is expected to become available with Android Q.

 

By following these general practices for making AR available over the web, we have ensured our PWA can accommodate features that may further enhance the augmented reality experience. This opens us up to consuming some of the features in ARKit 3 and RealityKit, such as people occlusion and facial tracking.

 

In the current state of this implementation, usdz files must be created on a case-by-case basis for each product in the catalog for which ARKit support is desired. Depending on the size of the catalog, modelling these products in 3D may take considerable effort and time. To alleviate this effort, we would further investigate dynamically generating object files from product photos, or integrating with a 3D object modelling service. These hurdles have to be cleared one way or another, since generating the necessary 3D models still requires some 3D modelling experience, at least for now.

Author: Shaun Maharaj
Posted: May 13, 2020, 5:21 pm

Introduction

IoT voice interaction was once the stuff of sci-fi movies, but now many of us no longer bat an eye. From computers to phones to digital assistants, talking to a device has gone from a futuristic dream to an in-our-homes reality. And it appears this field has only begun to scratch the surface of its widespread potential. According to a recent OC&C Strategy Consultants study, voice shopping could surpass $40 billion across the US and UK by 2022 (up from $2 billion today).

For Elastic Path’s recent Hackdays, our team looked at voice-enabled commerce powered by Cortex. Specifically, we focused on enabling expert users to interact as they normally would when placing complex orders, such as coffee orders. We wanted them to talk to the system rather than through the traditional digital interactions.

While Cortex ran the commerce side of things, we used Google’s Dialogflow to handle voice recognition and created a small NodeJS server to glue it all together. The conceptual secret sauce, though, was a context-driven approach complementing catalog-driven language processing.

[Illustration: voice commerce with Google Assistant and AWS]

Context Matters

Behind the words, buying things in real life is quite complicated. When a customer says, “I’d like a triple shot espresso, please,” the underlying concepts at play, translated for a commerce system, include the desire for the item, the item variety itself, the intent to order the item, and a desire (or willingness) to pay.

“I’d like a triple espresso, please” <==> “I desire the espresso product, but I want it of the triple shot variety. Also, I’d like to order this configured item and I am ready to pay for it.”

{ "queryResult": { "queryText": "I’d like a triple espresso, please", "parameters": { "number": "", "size": "triple", "product": "expresso-bundle" }, "allRequiredParamsPresent": true, "fulfillmentText": "Okay, so you want triple espresso. Would you like to pay?", "fulfillmentMessages": [ { "text": { "text": [ "Okay, so you want triple espresso. Would you like to pay?" ] } } ], "outputContexts": [ { "lifespanCount": 2, "parameters": { "number.original": "", "product.original": "espresso", "size.original": "triple", "number": "", "size": "triple", "product": "expresso-bundle" } } ], "intent": { "displayName": "I want" }, "intentDetectionConfidence": 0.8966336, "diagnosticInfo": { "webhook_latency_ms": 30 }, "languageCode": "en" } }

This is one of the key challenges for eCommerce voice interactions: context sensitivity. The ability to recognize key points from a single command makes transactions smoother, encouraging adoption, reducing friction, and allowing voice interactions to mimic real-world experiences. For a commerce system, “context” roughly translates to “what else” or the “next actions”. This just so happens to be Cortex’s specialty.

From Context to Commerce: Cortex Zooms to Next Actions

When you ask for an espresso, the set of underlying requirements include identifying the product, ordering, and paying. Cortex, with its flexibility in presenting the client with next actions (adhering to the best practices of a mature REST Level 3 API), provides zoom parameters to link between desired actions (Cortex documentation available here).

Search for the ‘espresso’ product ==> include the ‘triple shot’ option ==> add it to my order ==> purchase the order

This string of actions fulfills a happy path model for ordering an espresso and the general actions (“find a product and add it to the cart”) are naturally supported by Cortex. However, a critical piece in providing flexible interactions is the ability to configure products and their add-ons on-the-fly, creating bundled products which dynamically reflect changing prices and options. Furthermore, we need to provide this functionality in a way that is predictable and consistent enough to establish a programmatic pattern (i.e. a determined chain of resource calls/zooms that we can use for any queried product), but flexible enough to support different kinds of configurations (additional shots, drink sizes, etc.). To accomplish this, we used a customized implementation for Dynamic Bundles in our APIs, which provided support for selecting from a list of bundle constituent options and dynamically adjusting corresponding products. Using this accelerator in concert with out-of-the-box Cortex endpoints provided dynamic product configuration, but within a predictable pattern for adding all desired options and accessing “next actions”.
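To make the chain concrete, here is a rough sketch of how a small NodeJS client might walk that happy path. The resource paths and zoom names are placeholders for illustration only, not the exact Cortex URIs or the accelerator’s endpoints:

// Illustrative only: CORTEX_URL, resource paths, and zoom names are placeholders,
// not the exact URIs used in our implementation.
const fetch = require('node-fetch'); // or the global fetch on newer Node versions

const CORTEX_URL = 'https://cortex.example.com/cortex';

async function cortexPost(path, headers, body) {
  const res = await fetch(`${CORTEX_URL}${path}`, {
    method: 'POST',
    headers: { ...headers, 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  return res.json();
}

async function orderTripleEspresso(headers) {
  // 1. Search for the product, zooming ahead to the item and its add-to-cart form.
  const search = await cortexPost(
    '/searches/keywords/items?zoom=element,element:addtocartform',
    headers,
    { keywords: 'espresso' },
  );

  // 2. Select the 'triple shot' constituent on the dynamic bundle.
  // 3. Submit the add-to-cart form for the configured item.
  // 4. Zoom from the default cart to the order's purchase form and submit it.
  // Each step simply follows the links/zooms returned by the previous response,
  // which is what keeps the pattern predictable for any queried product.
  return search;
}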

Given this translation from context to commerce, the next challenge is recognizing context in voice commands.

The Gift of Gab: Natural Language Processing

Many large technology companies offer NLP services to extract intents and details. We chose Google’s Dialogflow over Facebook’s Wit.ai and IBM’s Watson due to its ease of testing, development, and extensibility. While both Wit.ai and Watson offer powerful language processing features, Dialogflow’s detailed feedback, deep community support, and streamlined connectivity with Android devices supported our rapid development and eventual demos with minimal additional configuration.

Dialogflow uses “intents” and “entities” to decipher and tag input text. From a high level, intents describe the goal of the input — this aligns very closely with the idea of “context”. By implementing an “I want” intent and training the NLP model with sentences that implied this resolve, we connected the context with various input possibilities. This provided programmatic contextualization of voice input. Training the model can be done through the Dialogflow GUI by providing sample inputs and assigning them to an intent. As the model receives additional input, it becomes smarter and more accurate about recognizing past, as well as novel, similar inputs. Behind the scenes, these intents and their trained inputs are represented as JSON (and may even be uploaded in a similar manner). Below is a JSON sample extracted from the “I want” intent’s list of trained inputs. This input associates the sentence “I want a triple espresso” with the desired intent, tagging the various pieces of the sentence.

{ "data": [ { "text": "i want ", "userDefined": false }, { "text": "one", "alias": "number", "meta": "@sys.number", "userDefined": false }, { "text": " ", "userDefined": false }, { "text": "triple", "alias": "size", "meta": "@size", "userDefined": false }, { "text": " ", "userDefined": false }, { "text": "espresso", "alias": "product", "meta": "@order", "userDefined": false } ], "isTemplate": false, "count": 0, "updated": 0, "isAuto": false },

Further, orders are rarely simple and recognizing variations on an order requires not only context understanding, but also detail recognition and relevancy knowledge. Product variations, like extra shots, different sizes, milk varieties, etc., require the NLP system to know which details to flag. These dynamic pieces of the input commands constitute “entities”, which are also defined through the Dialogflow GUI and associated with appropriate intents. This is where the key details of a catalog come into play. With manually-imported catalog data and specified, corresponding configuration options, Dialogflow learned to parse specific products and variations from inputs, providing this information in the JSON output, as well. In the example above, we see the system tagging things like “size” and “product” — these are predefined entities associated with the “I want” intent. Hence, when we provide training input that resolves to this intent, the system picks up the related, expected entities and validates these for increased specificity and accuracy going forward.

Dialogflow also provides tools for testing new inputs and visualizing the resulting output as JSON. After training the model, providing the input “I’d like a triple espresso, please” produces output like the sample shown earlier.

This provides the necessary structural predictability, allowing us to consume, tag, and decipher vocal input data. We then used Dialogflow’s Fulfillment module to pass these details on to our NodeJS server, which parsed this data and kicked off the expected Cortex flow to fulfill these desires.
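A minimal sketch of that glue server’s fulfillment endpoint might look like the following. The Express route, response text, and the placeOrder() helper are illustrative stand-ins for our actual implementation:

// Sketch of a Dialogflow fulfillment webhook handler in Express.
const express = require('express');

// Hypothetical helper standing in for the Cortex happy path described above.
async function placeOrder(product, size) {
  // search -> configure bundle -> add to cart -> purchase, via Cortex zooms
}

const app = express();
app.use(express.json());

// Dialogflow posts the detected intent and parameters to this endpoint.
app.post('/dialogflow/fulfillment', async (req, res) => {
  const { intent, parameters } = req.body.queryResult;

  if (intent.displayName === 'I want') {
    const { product, size } = parameters;
    await placeOrder(product, size);
    res.json({ fulfillmentText: `Okay, so you want ${size} ${product}. Would you like to pay?` });
    return;
  }

  res.json({ fulfillmentText: 'Sorry, I did not catch that.' });
});

app.listen(3000);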

Taking this a step further, Dialogflow allows users to import detail recognition knowledge (i.e. entity definitions) as JSON data. For example, the following provides a snippet of the JSON definition for a “size” entity.

{ "id": "123", "name": "size", "isOverridable": true, "entries": [ { "value": "doppio", "synonyms": [ "doppio", "double" ] }, { "value": "grande", "synonyms": [ "grande", "large" ] }, { "value": "quad", "synonyms": [ "quad", "quadruple" ] }, { "value": "short", "synonyms": [ "short", "small" ] }, { "value": "tall", "synonyms": [ "regular", "tall" ] }, { "value": "triple", "synonyms": [ "triple" ] }, { "value": "venti", "synonyms": [ "vendi", "venti" ] } ], "isEnum": false, "automatedExpansion": true, "allowFuzzyExtraction": false, "isRegexp": false }

Given this capability, a user could group catalog-based add-ons under specific entities to provide automated and dynamic catalog-driven language processing. For example, if espresso products are linked to a set of SKU options relating to size, we can write a script that parses this source data (in our case, an XML file) and outputs a JSON entity definition for “size”, assigning the parsed SKU options as entity “values” and “synonyms”.

<product>
  <code>espresso-code</code>
  ...
  <availability>
    <storevisible>true</storevisible>
    <availabilityrule>ALWAYS_AVAILABLE</availabilityrule>
    ...
  </availability>
  <attributes>
    ...
  </attributes>
  <skus>
    <sku guid="double">
      <code>double_espresso_sku</code>
      <skuoptions>
        <skuoption code="size">
          <skuoptionvalue>double</skuoptionvalue>
        </skuoption>
      </skuoptions>
      ...
    </sku>
    <sku guid="triple">
      <code>triple_espresso_sku</code>
      <skuoptions>
        <skuoption code="size">
          <skuoptionvalue>triple</skuoptionvalue>
        </skuoption>
      </skuoptions>
      ...
    </sku>
  </skus>
</product>

{
  "id": "123",
  "name": "size",
  "isOverridable": true,
  "entries": [
    { "value": "double", "synonyms": [ "double" ] },
    { "value": "triple", "synonyms": [ "triple" ] }
  ]
}
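As a rough sketch of such a script (using a naive regular expression rather than a full XML parser, and assuming a catalog export shaped like the snippet above with a hypothetical file name), this is one way to emit the entity definition:

// Rough sketch: pull the <skuoptionvalue> entries out of the catalog export and
// emit a Dialogflow entity definition for "size". A real catalog would warrant a
// proper XML parser rather than this naive regex.
const fs = require('fs');

const xml = fs.readFileSync('catalog-export.xml', 'utf8'); // hypothetical file name

const values = [...xml.matchAll(/<skuoptionvalue>([^<]+)<\/skuoptionvalue>/g)]
  .map((match) => match[1]);

const entity = {
  name: 'size',
  isOverridable: true,
  entries: [...new Set(values)].map((value) => ({ value, synonyms: [value] })),
};

fs.writeFileSync('size-entity.json', JSON.stringify(entity, null, 2));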

What It All Means

Context-aware voice commerce fills a clear role in an increasingly voice-enabled digital world. By combining natural language processing with the flexible commerce functionality of Cortex, we smoothly gleaned contextual details and turned natural voice commands into frictionless buying experiences for customers. Looking forward, context awareness and catalog-driven language processing may empower complex purchases across industries, while the natural interactions driving these actions further bridge the gap between real-world interactions and streamlined digital commerce.

Author: Wes Berry
Posted: May 12, 2020, 10:30 pm

Online merchants have embraced experience-driven commerce for over a decade, using headless commerce and APIs to allow content management systems (CMS) and digital experience platforms (DXP) to power the front end, rather than be restricted to the features and functionality offered out-of-the-box with commercial commerce platforms.

Why use a CMS with your headless commerce platform?

In addition to more robust content capabilities, CMS-powered commerce:

  • Supports rich, interactive experiences that showcase the brand
  • Unifies websites -- brands no longer need to use a sub-domain for the “e-store” that requires its own systems (with the cost, maintenance and user friction a separate e-store entails)
  • Enables business users to make content updates independent of IT and schedule cutovers
  • Enables back-end developers to make updates without worrying if it might break something “up front”
  • May extend content to flexible, mobile-first front end frameworks such as React.js, Vue.js, Ember.js et al for Single Page Applications (SPAs) and Progressive Web Apps (PWAs) -- depending on your environment

Since the rise of content-driven commerce, both commerce and content management platforms have continued to evolve. Today’s modern commerce platforms go beyond simply decoupling the front end to enable CMS-driven commerce to embrace multi-touchpoint extensibility and support any touchpoint -- and to make life easier for developers. Content management solutions have gone headless for the same reasons.

Decoupled versus headless CMS

Like legacy ecommerce platforms, content management systems were traditionally developed and deployed as monoliths with all components tightly coupled within a single code base, making development, scalability and extensibility challenging and complex. And as with digital commerce, such CMSes were built primarily for the Web, before the advent of mobile, wearables, IoT, voice and other modern touchpoints.

Decoupled content management

Many CMS vendors have taken the first step towards headless architecture by decoupling the front end from the back end content repository, thus separating content creation from delivery. With a decoupled CMS, the head is provided with the solution, but its use is optional. The CMS’ API can connect to any front end in place of or alongside the application’s head.

This decoupling enables content to be deployed to multiple front end environments, and also allows a website to be redesigned or significantly updated without reimplementing or restarting the CMS.

True headless content management

True headless content management applications don’t provide any head, assuming the organization using the CMS wants to deploy content to their own, customized heads with the flexibility to swap-in and swap-out best-of-breed solutions over time.

But the key differentiator between decoupled and headless content management is flexibility and control. True headless allows custom content and experiences to be served from the same content hub to multiple front ends and devices, tailored to their context. Content assets can be remixed and recomposed based on what makes the most sense for the experience and device form factors.

For example, mobile apps can have their own look and feel, show or hide custom content, reformat content or support their own user journeys. Or, touch screen applications can show selective content, formatted for their location, purpose and screen resolution. Similarly, you can reuse a buy button, form, banner or any design element across any experience.
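As a simple illustration of that reuse (the content API endpoint and field names below are entirely hypothetical), the same fragment can be fetched once from the content hub and composed differently per touchpoint:

// Hypothetical content API and field names, for illustration only.
// Assumes a fetch implementation (browser, or node-fetch on the server).
const CONTENT_API = 'https://cms.example.com/api/content';

async function getBanner(slug) {
  const res = await fetch(`${CONTENT_API}/banners/${slug}`);
  return res.json(); // e.g. { heading, body, imageUrl }
}

// Web storefront: render the full banner with its image.
async function renderWebBanner(slug) {
  const banner = await getBanner(slug);
  return `<section class="hero"><img src="${banner.imageUrl}" alt=""><h2>${banner.heading}</h2><p>${banner.body}</p></section>`;
}

// Voice or kiosk touchpoint: reuse only the text, formatted for that context.
async function speakBanner(slug) {
  const banner = await getBanner(slug);
  return `${banner.heading}. ${banner.body}`;
}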

It’s not uncommon for an enterprise to run multiple, siloed CMSes to cover their channels and touchpoints. With headless content management, virtually any experience can be powered from a single CMS without restrictions or sacrificing custom content to satisfy all channels.

Better together: how headless commerce + content management support the agile enterprise

Deliver truly unique and contextual experiences across touchpoints

Enterprises that embrace headless commerce and microservices to quickly spin up new and innovative experiences shouldn’t be held back by monolithic content platforms! Headless commerce-plus-content ensures omnitouchpoint experiences are optimized for both form and function.

Support greenfield projects without a legacy rip-and-replace

The best part of an agile technology environment is the ability to launch new projects -- even experimental or greenfield projects -- without investing in separate, siloed platforms, introducing risk into the legacy environment, or taking months or years to deploy.

Future-proof your environment

API-driven solutions provide the flexibility to swap-in and swap-out best of breed solutions. For headless commerce, today this means leveraging lightning-fast front end frameworks like React.js and its cousins to serve mobile-friendly single page applications (SPAs) and progressive web apps (PWAs). As new technologies and devices come down the pike, they can simply bolt-on to what you’ve built on your back end.

When headless content management isn’t right for you

As with headless commerce, headless content isn’t right for every organization. Headless CMS is best for teams with seasoned developers, the appetite to deploy unique experiences across touchpoints, and the need for agility.

However, in today’s mobile-first, multi-touchpoint world, a decoupled CMS at minimum is suitable for most enterprises and ensures modern front ends such as SPAs and PWAs are supported.

 

Author: Linda Bustos
Posted: May 12, 2020, 9:41 pm