{"id":807,"date":"2022-10-11T12:44:11","date_gmt":"2022-10-11T10:44:11","guid":{"rendered":"https:\/\/nail.cs.ut.ee\/?post_type=team&#038;p=807"},"modified":"2023-01-30T11:45:46","modified_gmt":"2023-01-30T09:45:46","slug":"marharyta","status":"publish","type":"team","link":"https:\/\/nail.cs.ut.ee\/index.php\/team\/marharyta\/","title":{"rendered":"Marharyta Domnich"},"content":{"rendered":"\n<div class = \"one_half \">\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"368\" height=\"368\" src=\"https:\/\/nail.cs.ut.ee\/wp-content\/uploads\/2023\/01\/marharytamain.png\" alt=\"\" class=\"wp-image-798\" srcset=\"https:\/\/nail.cs.ut.ee\/wp-content\/uploads\/2023\/01\/marharytamain.png 368w, https:\/\/nail.cs.ut.ee\/wp-content\/uploads\/2023\/01\/marharytamain-300x300.png 300w, https:\/\/nail.cs.ut.ee\/wp-content\/uploads\/2023\/01\/marharytamain-150x150.png 150w\" sizes=\"auto, (max-width: 368px) 100vw, 368px\" \/><\/figure>\n\n\n\n<\/div>\n\n\n\n<div class = \"one_half last \">\n\n\n\n<div class=\"info-text\">\n\n\n\n<p>Marharyta Domnich<\/p>\n\n\n\n<\/div>\n\n\n\n<p>Generating explanations from AI models is essential for building trust in model decisions, especially in domains such as healthcare, finance, and criminal justice, where the consequences of incorrect decisions are severe. Explanations can help identify problems or biases in the model and directly support model debugging.<\/p>\n\n\n\n<p>Marharyta is participating in the EU-funded <a href=\"http:\/\/www.trustai.eu\/\">TRUST-AI<\/a> project, which aims to build a Transparent, Reliable and Unbiased Smart Tool that produces model decisions together with interactive explanations. 
Together, they are building a platform with explainable decisions that serves healthcare, online retail, and energy use cases.<\/p>\n\n\n\n<p><a href=\"mailto:marharyta.domnich@ut.ee\">marharyta.domnich@ut.ee<\/a><\/p>\n\n\n\n<p><div class=\"sonar-wrapper\"><div class=\"sonar-emitter\"><div class=\"sonar-wave\"><\/div><\/div>Explainable AI and Counterfactual Explanations<\/div><\/p>\n\n\n\n<\/div><div class = \"clear\"><\/div>\n","protected":false},"featured_media":795,"template":"","meta":[],"class_list":["post-807","team","type-team","status-publish","has-post-thumbnail","hentry"],"_links":{"self":[{"href":"https:\/\/nail.cs.ut.ee\/index.php\/wp-json\/wp\/v2\/team\/807","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/nail.cs.ut.ee\/index.php\/wp-json\/wp\/v2\/team"}],"about":[{"href":"https:\/\/nail.cs.ut.ee\/index.php\/wp-json\/wp\/v2\/types\/team"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/nail.cs.ut.ee\/index.php\/wp-json\/wp\/v2\/media\/795"}],"wp:attachment":[{"href":"https:\/\/nail.cs.ut.ee\/index.php\/wp-json\/wp\/v2\/media?parent=807"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}