Home Assistant 2024.6 is now available, and this release brings some very interesting new features if you have a smart speaker integrated into your setup or use Assist regularly. We can now plug in OpenAI's and Google's AI to make the assistant smarter. There are also notable changes and improvements to the Lovelace dashboards, both in the sections layout released a few months ago and in the cards we have used from the start. Want to know everything that is new in this release?
This June update brings a significant evolution of Assist and voice control, and the project's developers already said that this would be the year of voice. There are also interesting additions to the graphical user interface, as well as new integrations.
More AI in Assist
The Home Assistant development team now lets us integrate OpenAI and Google Generative AI so that they can control our home. When we give instructions by voice or through Assist, the AI understands what is being asked and carries out the actions we want. Naturally, we must first have the corresponding integrations set up, give our entities descriptive names and assign them to areas so that everything works properly. Once an AI is registered, it can access whichever Home Assistant entities we expose to it in order to control those aspects of our home. Conversation agents such as Google Gemini integrate perfectly and can do everything Assist can do, but with greater intelligence.
Adding an AI like Google's is very simple: go to "Settings / Devices and services", add the integration by searching for "Google" and choose "Google Generative AI." You will need to create an API key, but that is easy to do.
Once added, we can use Assist with the intelligence provided by Google’s AI.
For example, if we have a light called "lamp" assigned to the "main bedroom" area, we can simply give the direct command: "turn on the lamp in the main bedroom", and the order is executed automatically. This already worked with Assist without AI, but now we can say things like: "I'm going to sleep in the main bedroom, turn off the light." The AI copes with much more "human" language, without needing such a literal command.
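If you want to send the same kind of phrase programmatically (for example from an automation), the conversation.process service lets you hand a sentence to a specific conversation agent. Here is a minimal sketch; the agent_id value and the trigger time are assumptions, and the real agent id comes from your own Google Generative AI integration:

```yaml
# Minimal sketch: send a natural-language request to a conversation agent
# from an automation. The agent_id below is an assumption; copy the real
# one from your own configuration.
automation:
  - alias: "Goodnight routine via AI agent"
    trigger:
      - platform: time
        at: "23:30:00"
    action:
      - service: conversation.process
        data:
          text: "I'm going to sleep in the main bedroom, turn off the light"
          agent_id: conversation.google_generative_ai  # hypothetical id
```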
Another voice improvement is that we can now use voice commands to control media players in the same area as the Assist speaker: if we say "pause", the media player located in the same area as that Assist satellite is paused. This greatly simplifies things, since we no longer need to give such long commands.
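Conceptually, this kind of command boils down to an area-targeted service call. As a rough, hand-written illustration (not the exact internal behaviour, and the area name is an assumption):

```yaml
# Rough equivalent of saying "pause" to an Assist satellite in the living
# room: the media players in that area receive a pause command.
service: media_player.media_pause
target:
  area_id: living_room  # assumed area; use one of your own
```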
Customization of panels, cards and sections
The development team now lets you upload a photo to use as the wallpaper of the different Lovelace dashboards. This is not entirely new, because it was possible before, but it had to be done manually: uploading the photo to the configuration files via Samba, setting the full path in a YAML file, and so on. Now we can simply upload the photo through the settings menu, or point to a path where the image is located.
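For reference, this is roughly what the old manual approach looked like in a dashboard's YAML; a minimal sketch, assuming an image copied into config/www/ (served under /local/):

```yaml
# Sketch of the older manual method: a CSS-style background on a view,
# pointing at an image uploaded to config/www/ (available as /local/...).
views:
  - title: Home
    background: "center / cover no-repeat url('/local/wallpaper.jpg') fixed"
```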
If you use "Sections", which are still experimental, you can now configure each section to be visible only when one or more conditions are met; if the conditions are not satisfied, the section is simply not displayed. The configurability this option gives us is very high, perfect for adapting dashboards to our needs.
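As a hedged sketch of what this looks like in the raw YAML of a sections dashboard (the helper and light entity are assumptions; editing the section in the UI stores the same structure):

```yaml
# Sketch: a section that is only shown while a hypothetical guest-mode
# helper is on.
type: sections
sections:
  - type: grid
    title: "Guest room"
    visibility:
      - condition: state
        entity: input_boolean.guest_mode  # hypothetical helper
        state: "on"
    cards:
      - type: light
        entity: light.guest_lamp  # hypothetical entity
```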
The same functionality is also available for cards: the "Visibility" tab shows all the new configuration options. Previously this could only be done per user or through complex conditional setups; now it is much easier to show or hide a card, and we can still combine complex conditions if we want.
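In YAML terms, the same visibility list is attached directly to a card; a minimal sketch with assumed entities (all conditions in the list must hold for the card to appear):

```yaml
# Sketch: an entities card shown only when it is hot outside and only to
# a specific user. Entities and the user id are assumptions.
type: entities
title: "Cooling"
entities:
  - climate.living_room_ac  # hypothetical entity
visibility:
  - condition: numeric_state
    entity: sensor.outdoor_temperature  # hypothetical sensor
    above: 25
  - condition: user
    users:
      - YOUR_USER_ID  # replace with an id from Settings > People
```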
Another addition is that, in the automations, scripts and entities sections, two new buttons let us collapse or expand all categories at once; from there we can expand only the categories we are interested in.
As you can see, there are important improvements to the graphical user interface; little by little the team keeps refining it, adding new features and making life easier for the user.
Other changes and improvements
Home Assistant 2024.6 also brings new integrations, such as AirGradient devices and APsystems microinverters; system events can now be forwarded to Azure Data Explorer; there is compatibility with LoRa devices; support for PIR and battery sensors has been added to the Reolink integration; and SwitchBot Meter, MeterPlus and Outdoor Meter devices are now supported.
This version released today is not a revolution, but it is an evolution that keeps improving the user experience.