imgZine’s revamped news algorithms create a more personal reading experience

Hester Gras

We’re very proud to announce that imgZine has launched a new version of its recommender engine. The engine analyses the reading behaviour of our end-users so it can quickly surface the articles most relevant to them, along with other articles that might be of interest. So, what’s new?

We recently optimised the algorithm that decides which news articles are relevant to which readers. The previous version of the recommender engine always recommended 25 articles to every user, regardless of how many they would actually read. The new algorithm instead focuses on the articles that will be read: if the engine predicts a user will read only three articles, it recommends only three. In addition, the algorithm is self-learning; the more people use the app, the better the recommendations become. So our apps may offer fewer articles, but the articles they do offer are more relevant!
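The idea above can be sketched in a few lines of code. This is an illustrative sketch only, not imgZine's actual implementation: the relevance scores and the predicted read count stand in for the outputs of the real machine-learning models.

```python
def recommend(articles, relevance_score, predicted_reads):
    """Return only as many articles as the user is predicted to read.

    articles: list of article ids
    relevance_score: dict mapping article id -> estimated relevance
        (hypothetical stand-in for the engine's learned scoring model)
    predicted_reads: estimate of how many articles this user will read
        (hypothetical stand-in for the engine's learned read-count model)
    """
    # Rank all candidate articles by relevance, highest first.
    ranked = sorted(articles, key=lambda a: relevance_score[a], reverse=True)
    # Old behaviour: always return 25. New behaviour: cut the list at the
    # number of articles the user is actually predicted to read.
    return ranked[:predicted_reads]


scores = {"a1": 0.9, "a2": 0.7, "a3": 0.6, "a4": 0.2}
# If the model predicts this user will read three articles,
# only the three most relevant ones are recommended.
print(recommend(list(scores), scores, 3))  # -> ['a1', 'a2', 'a3']
```

The self-learning aspect would correspond to continually retraining the scoring and read-count models on fresh reading behaviour, which is omitted here.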

Why personalise the reading experience?

The recommender engine is a response to the information overload that we face today. In a previous blog post on the usability of the intranet vs. internal communications apps, we showed that information overload is the biggest cause of employees’ negative associations with the intranet. Marijn Deurloo, CEO of imgZine:

“We’ve created more information in the last 10 years than in all of human history. All of this is more than the brain is configured to handle. Ask people how they are doing these days, and the first word you will hear is busy. We do more, go to more places, and buy more. As time becomes our most scarce asset, limiting this flow to relevant information will increasingly become important. Complex machine learning algorithms will play an important role in our lives, as we train them with our behaviour.”

