What loss function to use when labels are probabilities?



What loss function is most appropriate when training a model with target values that are probabilities? For example, I have a 3-output model with x=[some features] and y=[0.2, 0.3, 0.5].



It seems like cross-entropy doesn't make sense here, since it assumes that a single target is the correct label.



Would something like MSE (after applying softmax) make sense, or is there a better loss function?










neural-networks loss-functions probability-distribution






asked 7 hours ago by Thomas Johnson (new contributor)


1 Answer
Actually, the cross-entropy loss function would be appropriate here, since it measures the "distance" between a distribution $q$ and the "true" distribution $p$.

You are right, though, that using a loss function called "cross_entropy" in many APIs would be a mistake. This is because these functions, as you said, assume a one-hot label. You would need to use the general cross-entropy function,

$$H(p,q) = -\sum_{x \in X} p(x) \log q(x).$$
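For concreteness, here is a minimal sketch of this general cross-entropy in NumPy; the `cross_entropy` helper name and the example prediction `q` are made up for illustration:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """General cross-entropy H(p, q) = -sum_x p(x) * log q(x).

    p: target distribution over classes (may be soft, e.g. [0.2, 0.3, 0.5])
    q: predicted distribution (e.g. softmax output); eps guards against log(0)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q + eps))

# Soft labels from the question, with a hypothetical model output:
p = [0.2, 0.3, 0.5]
q = [0.3, 0.3, 0.4]
print(cross_entropy(p, q))  # ~1.0601
```

Minimizing $H(p,q)$ in $q$ is equivalent to minimizing the KL divergence $D_{KL}(p \| q)$, since the entropy of the fixed target $p$ does not depend on the model's parameters.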



Note that one-hot labels would mean that
$$
p(x) =
\begin{cases}
1 & \text{if } x \text{ is the true label} \\
0 & \text{otherwise}
\end{cases}
$$



which causes the cross-entropy $H(p,q)$ to reduce to the form you're familiar with:

$$H(p,q) = -\log q(x_{\text{label}}).$$
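To see this reduction numerically, here is a small self-contained check (the numbers are illustrative, reusing the prediction from the sketch above):

```python
import numpy as np

# With a one-hot target, H(p, q) collapses to -log q[true_label].
q = np.array([0.3, 0.3, 0.4])
p_onehot = np.array([0.0, 0.0, 1.0])   # class 2 is the true label

print(-np.sum(p_onehot * np.log(q)))   # 0.9163
print(-np.log(q[2]))                   # same: -log(0.4) = 0.9163
```

In practice, note that some library losses named "cross entropy" accept only integer class indices (the one-hot case), so with soft labels you may need to write the sum out yourself, e.g. $-\sum_x p(x) \log q(x)$ over the class dimension, averaged over the batch.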






answered 6 hours ago by Philip Raeisghasem