Adding Alexa to our BTicino system

As mentioned in my previous posts on home automation, adding voice assistant capability to our BTicino MyHome automation system was one of my goals for its modernization. We have already been using Amazon’s Alexa for some time to control the Sonos sound systems in most of our rooms, so Alexa was the obvious choice for adding voice control to the BTicino system as well.

While our MyHomeServer1 from BTicino integrates with Alexa, it does so only for newer BTicino components, and most of our F411/2 actuators are not supported. But openHAB, which we had just added to our setup to integrate wireless components from Shelly, also supports Alexa, and I therefore chose to go that route.

So in this post I’ll describe how I set up the integration between Alexa, openHAB and our BTicino system, in the hope it may help others facing the same task.

Let’s start with an overview of the setup. Our Alexa-enabled devices, mostly Sonos speakers and one Echo Dot, send their voice commands to Amazon’s Alexa service (1), which forwards them to openHAB via the openHAB Cloud (2, 3). OpenHAB processes those commands and triggers our BTicino and Shelly actuators or scenarios in the MH202 scenario programmer (4).

The steps required to implement this setup are:

  • Connect openHAB to the openHAB Cloud.
  • Install the Alexa skill for openHAB.
  • Set up openHAB to process Alexa commands for single devices, groups of devices or scenarios.

The first two steps in particular were pretty straightforward, so I’ll just link to the documentation I followed. For the last step I will dive a bit deeper into the implementation, especially into how to call up MH202 scenarios from openHAB.

Connect OpenHAB to the Cloud

As a first step, you need to connect your local openHAB instance to an instance of the openHAB Cloud, which implements features like remote access, push notifications and integration with other cloud services such as Alexa. The openHAB Foundation operates a free-to-use instance at myopenhab.org.

The openHAB Cloud Connector page provides instructions for connecting your openHAB instance, and it only takes a few minutes to install the connector and create an account with your local instance’s UUID and Secret.
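If you prefer text-based configuration, the connector can also be set up via a configuration file. A minimal sketch, assuming the free openHAB Foundation instance is used; the expose list is left empty here and the exact set of supported keys is best checked against the Cloud Connector documentation:

```properties
# conf/services/openhabcloud.cfg -- values are examples
# URL of the openHAB Cloud instance to connect to
baseURL=https://myopenhab.org/
# Items whose states should be pushed to the cloud (optional, comma-separated)
expose=
```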

Install Alexa Skill

Next, log in to your Alexa app for iOS or Android and install the openHAB skill. You will be asked to sign in with the account you created in the previous step.

Control of Single Devices or Groups of Devices

OpenHAB’s Amazon Alexa Smart Home Skill page provides detailed instructions on how to add metadata to your Items in openHAB so they can be controlled by Alexa. This works for single devices as well as for Groups, such as Locations. Basically, you assign a device type to each Item that defines its features and capabilities. The name of the device used in your voice commands is either derived from its Label, or you can change it in the Alexa metadata, or later in the Alexa app once the device has been added there. Since Items are independent of the underlying technology, these steps work the same for our BTicino and Shelly components. The screenshot below shows the Alexa metadata page for an Item.
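If you manage your Items in .items files instead of the UI, the Alexa metadata is added directly to the Item definition. A minimal sketch with hypothetical Item and Group names; the channel links are omitted for brevity:

```
// lights.items -- names are examples, channel links omitted
Switch  LivingRoom_Light  "Living Room Light"  (gLivingRoom)  {alexa="Light"}
Dimmer  Kitchen_Dimmer    "Kitchen Light"                     {alexa="Light"}
```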

Once the metadata has been added, you can run a discovery of new devices from Alexa via voice command: “Alexa, discover devices”. Alexa should then add the newly found devices or groups, and you’ll also be able to access them via the app to organize them into rooms or modify their properties. The screenshot below from the iOS app shows a light that has been added and can now be controlled from Alexa:

Control of Scenarios

In addition to controlling devices managed by openHAB, I also wanted to enable Alexa to call up scenarios managed by our BTicino MH202 scenario programmer. In my previous article on the integration between BTicino and openHAB, I covered only switching devices controlled by BTicino from openHAB, not calling up scenarios. We’ll therefore now dive a bit deeper into this topic. The method described here is not only useful for the integration with Alexa, but for any other use case where you’d like openHAB to call up a scenario in BTicino.

Activating a BTicino Scenario from OpenHAB

To activate scenarios in the MH202 you’d typically use CEN or CEN+ messages. In our case I decided on CEN messages, since some of our older controls only support CEN and not CEN+. Also, while the MH202 can be triggered by CEN+ messages, it can itself only send CEN messages.

The OpenWebNet binding supports sending out CEN/CEN+ messages, but this requires scripting and is thus a bit less straightforward than receiving them.

Let’s assume you have already set up your scenarios in the MH202 and have assigned scenario addresses to them, for example as shown in the table below. When using CEN, scenarios are addressed by an A/PL address and a virtual button number, and you can address up to 32 scenarios via virtual buttons within one A/PL address.

Then, as a first step, you need to set up a Thing with the scenario addresses you are using. If you have already set up a Thing to receive those messages and process them in openHAB, you can reuse that same Thing for sending. The screenshot below shows the Thing I am using for both receiving and sending CEN messages for scenarios:
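Such a Thing can also be defined in a .things file. A sketch based on my reading of the OpenWebNet binding documentation; the gateway address, credentials, Thing IDs and the CEN where address are all examples:

```
// bticino.things -- host, passwd, IDs and addresses are examples
Bridge openwebnet:bus_gateway:mybridge "MyHOMEServer1" [ host="192.168.1.35", passwd="12345" ] {
    Thing bus_cen_scenario_control scenarios "CEN Scenarios" [ where="51", buttons="1,2,3,4" ]
}
```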

Sending out CEN messages from openHAB requires a small piece of code, which you can include in your rules. The Scenario channels section of the binding documentation has some more information on that. I am using the JavaScript add-on for scripting, and the JavaScript syntax of the command looks like this, with <thing_uid> to be replaced by the UID of the Thing and <virtualbutton> by the virtual button number of the scenario you want to call up:

var cen = actions.get("openwebnet", "<thing_uid>");
cen.virtualPress("START_PRESS", <virtualbutton>);

For reusability, I have created a non-semantic Item “BT_Scenarios” of type Number, and a rule that reacts to commands sent to that Item and activates the scenario with the number received as command. That way other rules can activate scenarios without any scripting, as we will see later on.

Here is how the Item “BT_Scenarios” looks:

And here is the rule and script that reacts to commands sent to the Item and sends a CEN message with the received number as virtual button:
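As a sketch, the same rule can also be written as a file-based JavaScript rule using the openhab-js library; the file location and the Thing UID are examples and need to match your own setup:

```javascript
// automation/js/bt_scenarios.js -- Thing UID is an example
var { rules, triggers, actions } = require('openhab');

rules.JSRule({
  name: "BT_Scenarios to CEN",
  description: "Send a CEN virtual press for the scenario number received as command",
  triggers: [triggers.ItemCommandTrigger('BT_Scenarios')],
  execute: (event) => {
    // Get the OpenWebNet thing actions for the CEN scenario Thing
    var cen = actions.get('openwebnet', 'openwebnet:bus_cen_scenario_control:mybridge:scenarios');
    // Use the received number as the virtual button to press
    cen.virtualPress('START_PRESS', parseInt(event.receivedCommand));
  }
});
```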

Once this is in place we can start adding rules to connect Alexa with the BTicino scenarios.

Activating the Scenario from Alexa

As a next step, we need to create an Item for each of the scenarios we want to make accessible to Alexa. Those are non-semantic Items that are handled separately from the physical Locations of the semantic model.

I have been using the Alexa device type Scene for these Items, with the openHAB type Switch.
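In .items syntax, such Scene Items could look like the following sketch; the names are examples, and supportsDeactivation controls whether Alexa also offers turning the scene off:

```
// scenes.items -- names are examples
Switch  Scene_Evening  "Evening Scene"   {alexa="Scene"}
Switch  Scene_AllOff   "Everything Off"  {alexa="Scene" [supportsDeactivation=true]}
```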

With the Alexa metadata set, Alexa can now discover and control those Items in the same manner as described for devices further above. The screenshot below shows an Item with Alexa device type Scene in the Alexa iOS app:

However, on their own, these new Items do not do anything yet; we now need to create rules for them that trigger the scenarios in the MH202. These rules fire on changes to the state of the Items and then send a command to the “BT_Scenarios” Item we created previously.
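As a sketch in file-based JavaScript (openhab-js), such a rule could look as follows; the Item name and the scenario number 3 are examples:

```javascript
var { rules, triggers, items } = require('openhab');

rules.JSRule({
  name: "Evening scene",
  description: "Activate MH202 scenario 3 when the Alexa scene is turned on",
  triggers: [triggers.ItemStateChangeTrigger('Scene_Evening', undefined, 'ON')],
  execute: () => {
    // BT_Scenarios forwards the number to the MH202 as a CEN virtual press
    items.getItem('BT_Scenarios').sendCommand(3);
  }
});
```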

However, there is one more thing to take care of, which is the state of the Item.

Stateless MH202 Scenarios vs. Stateful Alexa Scenes

Scenarios in the MH202 are stateless, i.e. they do not have an on or off state. Lights in Alexa are always stateful, i.e. on or off, while Scenes can be either: in their openHAB Alexa metadata you can define whether a scene can only be activated or also deactivated. However, for a scenario that is actually intended to turn off devices, that leads to counterintuitive voice commands, because you’d have to tell Alexa to turn the scene on instead of off.

The way I have handled this is:

  • In some cases there are two MH202 scenarios that counter each other; these can then be implemented as one Alexa Scene that can be activated and deactivated.
  • For MH202 scenarios that turn devices on, I use Alexa Scenes without deactivation.
  • For MH202 scenarios that turn devices off, I use Alexa Scenes with deactivation but no action on activation.

Since the Item representing the Alexa Scene in openHAB is of type Switch, it is also always stateful. A simple solution for the last two cases above is to amend the rule to reset the state immediately after execution. The screenshots below show how to do that. A more elaborate solution would be to have openHAB actually detect whether any of the devices that are part of the scenario has changed its state, and then reset the state of the Alexa Scene Item; but whether and how you can do that depends on your individual scenario.
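As a sketch, the reset can be a postUpdate at the end of the rule, here for a hypothetical “Everything Off” scene mapped to scenario number 4. Using postUpdate instead of sendCommand only updates the Item state, so no command is sent to any binding; note that the update still counts as a state change, which is harmless here because this deactivation-only scene has no rule triggering on ON:

```javascript
var { rules, triggers, items } = require('openhab');

rules.JSRule({
  name: "Everything-off scene",
  description: "Activate MH202 scenario 4 on deactivation, then reset the Item state",
  triggers: [triggers.ItemStateChangeTrigger('Scene_AllOff', undefined, 'OFF')],
  execute: () => {
    items.getItem('BT_Scenarios').sendCommand(4);
    // Reset the switch so the scene can be deactivated again next time
    items.getItem('Scene_AllOff').postUpdate('ON');
  }
});
```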

And now that we have tied everything together, you can start triggering your MH202 scenarios with voice commands to Alexa!

Final Words

Since we already had Alexa in place, it was the natural choice for adding the capability to control our devices and scenarios, and I am quite happy that openHAB enabled us to do so where BTicino themselves fell short due to their lack of support for older components.

The integration between openHAB and Alexa was simple to set up and works well. Calling up MH202 scenarios from openHAB was a little bit tricky but also not too difficult to figure out, partly again thanks to the help from the folks in the openHAB community.

As always, I hope you find the information in this post useful and please don’t hesitate to contact me with any questions, comments or suggestions.