Tuesday, 28 May 2019

An approach to enhance machine learning explanations

Researchers at IBM Research U.K., the U.S. Military Academy and Cardiff University have recently evaluated Local Interpretable Model-Agnostic Explanations (LIME), a technique for attaining a better understanding of the conclusions reached by machine learning algorithms. Their paper, published in the SPIE Digital Library, could inform the development of artificial intelligence (AI) tools that provide exhaustive explanations of how they reached a particular outcome or conclusion.
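The core idea behind LIME (introduced by Ribeiro et al., 2016) can be sketched in a few lines: perturb the input around the instance being explained, query the black-box model on those perturbations, weight the samples by proximity, and fit a simple linear surrogate whose coefficients serve as local feature attributions. The sketch below is illustrative only, not the paper's code; the black-box model, noise scale, and kernel width are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    """Hypothetical opaque classifier: probability rises with x0, falls with x1."""
    return 1.0 / (1.0 + np.exp(-(2.0 * X[:, 0] - 1.5 * X[:, 1])))

def lime_explain(instance, predict, n_samples=1000, kernel_width=0.75):
    # 1. Perturb the instance with Gaussian noise.
    X = instance + rng.normal(scale=0.5, size=(n_samples, instance.size))
    # 2. Query the black-box model on the perturbed points.
    y = predict(X)
    # 3. Weight each sample by its proximity to the original instance.
    d2 = ((X - instance) ** 2).sum(axis=1)
    w = np.exp(-d2 / kernel_width ** 2)
    # 4. Fit a locally weighted linear surrogate (closed-form weighted
    #    least squares) and return its per-feature coefficients.
    A = np.hstack([X, np.ones((n_samples, 1))])  # add intercept column
    W = np.diag(w)
    theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return theta[:-1]  # local attribution for each feature

coefs = lime_explain(np.array([0.2, -0.1]), black_box)
print(coefs)  # x0 should receive a positive weight, x1 a negative one
```

The surrogate's coefficients mirror the black box's local behaviour: a positive weight on x0 and a negative weight on x1, matching the signs inside the hypothetical model.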

* This article was originally published here

Lights in the sky from Elon Musk's new satellite network have stargazers worried

UFOs over Cairns. Lights over Leiden. Glints above Seattle. What's going on?


Licorice tea causes hypertensive emergency in patient

Licorice tea, a popular herbal tea, is not without health risks, as demonstrated by a case study in CMAJ (Canadian Medical Association Journal) of a man admitted to hospital with a hypertensive emergency.
