
Introducing IntuiFace Version 6.0

Combining cutting-edge features with the power of machine learning, IntuiFace Version 6 puts limitless expression and assisted design in the hands of the creative in all of us.

As the innovation leader for no-coding interactive design creation, IntuiFace has a responsibility to define the future, not just chase it. At IntuiLab, we see the future bringing together ever more powerful expressive capabilities with machine learning. The goal is to incorporate crowd-sourced and best-practice insight and apply it intelligently.

IntuiFace Version 6 is a major step toward realizing this commitment, embracing our no-coding mantra to combine increasingly powerful design features with guidance from an intelligent system that learns best practices from individual and community contributions. Yes, it's as cool as it sounds.

We've recorded a live demo of all the key features. You can see it here:

So how does IntuiFace Version 6 push the envelope?

  • API Explorer
    Enter any REST-based Web Service query and our API Explorer will automatically display that service's response, then use a machine learning (ML) engine to preselect which content to display within IntuiFace. By changing those selections, you teach the ML engine about your preferences. Finally, a separate machine learning engine generates a visual of the selected content within your experience, a visual that is dynamically connected to the Web Service. When the service is refreshed, so is IntuiFace. All this without writing one line of code! Patent-pending stuff. (See the sketch after this list for the kind of Web Service exchange involved.)
  • Experience Layers, Triggers and Background
    Experience layers are similar to the concept of a master template in slideware, but much more powerful. They are optionally universal to all scenes and remain onscreen throughout scene-to-scene navigation, so you can, for example, play an audio file without interruption as users navigate. Experiences themselves can now host a background and have their own associated triggers, making it easy to centralize key design aspects across all scenes.
  • Design Assistant
    Every time you add content linked to dynamic data sources (for example, data coming from the API Explorer), IntuiFace will use a machine learning (ML) engine to scan best practices and suggest optimal layout options. Your choices "train" this ML engine, helping it understand your preferences and improving the quality of its suggestions. There is also a quick-access option within Composer for changing collection styles on the fly.
  • X-Ray Panel
    Easily inspect all properties, triggers and actions exposed by services accessed by IntuiFace through an external API. This applies to REST-based Web Services, .NET DLLs and JavaScript programs. Plus, through drag-and-drop, you can create visual assets automatically bound to any of these editable or read-only properties.
  • Share and Deploy Console
    We've completely overhauled the Management Console, the Web-based tool used to share published experiences and to deploy those experiences to devices anywhere in the world. Super fast and super scalable to handle thousands of devices, it's what our Enterprise customers have been asking for.
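
To make the API Explorer scenario a little more concrete, here is roughly what such a Web Service exchange looks like outside of IntuiFace. This is a minimal sketch only: the endpoint, query parameters and field names below are hypothetical, and IntuiFace itself handles all of this without any code.

```python
import requests

# Hypothetical REST endpoint -- the kind of query you would paste into the API Explorer.
URL = "https://api.example.com/v1/exhibits"

response = requests.get(URL, params={"city": "Toulouse", "limit": 5}, timeout=10)
response.raise_for_status()

# A typical JSON payload is a list of records, each carrying many fields.
# The API Explorer's ML engine preselects the fields worth displaying
# (say, "title" and "imageUrl") and lets you correct that selection,
# which in turn teaches the engine your preferences.
for exhibit in response.json().get("exhibits", []):
    print(exhibit.get("title"), "-", exhibit.get("imageUrl"))
```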

There's more: IntuiFace is now available on Windows with a 64-bit architecture, we've simplified support for external trigger sources like RFID/NFC readers, and we've released a brand-new packaging, pricing and licensing model. Check out the release notes for all of the details.

Remember how I mentioned above that Version 6 is a major step? It is, but that implies we're not done - and we aren't. This really is just the beginning.

Geoff Bessin

Officially, IntuiLab's marketing chief. Unofficially, IntuiFace's #1 fan.

