Creating a single perceptron for training


I am learning how a perceptron works and attempted to create a function out of it.



I recently watched a YouTube video as an introduction to the topic.



Right now, I am trying to mimic his function, and I would like to apply it to a sample dataset:



import numpy as np
import pandas as pd

#         x1   x2   y
data = [ [3.5, 1.5, 1],
         [2.0, 1.0, 0],
         [4.0, 1.5, 1],
         [3.0, 1.0, 0],
         [3.5, 0.5, 1],
         [2.0, 0.5, 0],
         [5.5, 1.0, 1],
         [1.0, 1.0, 0],
         [4.5, 1.0, 1] ]

data = pd.DataFrame(data, columns=["Length", "Width", "Class"])


Sigmoid function:



def sigmoid(x):
    return 1 / (1 + np.exp(-x))
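
As a quick sanity check (an illustrative addition, not part of the original post), sigmoid should map any real input into the open interval (0, 1):

print(sigmoid(0))    # 0.5
print(sigmoid(10))   # ~0.99995, saturating toward 1
print(sigmoid(-10))  # ~0.0000454, saturating toward 0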


Perceptron function:



w1 = np.random.randn()
w2 = np.random.randn()
b = np.random.randn()

def perceptron(x1, x2, w1, w2, b):
    z = (w1 * x1) + (w2 * x2) + b
    return sigmoid(z)
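
For context, a single forward pass on the first row of the sample data could look like this (an illustrative sketch, not part of the original post):

row = data.iloc[0]  # Length=3.5, Width=1.5, Class=1
prediction = perceptron(row["Length"], row["Width"], w1, w2, b)
print(prediction)   # some value in (0, 1); arbitrary before training, since the weights are random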


My question is: how can I add the cost function inside the perceptron and loop it n times, based on a parameter, to adjust the weights using the cost function?



def get_cost_slope(b, a):
    """
    b = predicted value
    a = actual value
    """
    sqrerror = (b - a) ** 2
    slope = 2 * (b - a)

    return sqrerror, slope
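
As a quick worked example (illustrative, not from the original post): a prediction of 0.8 against an actual label of 1 gives a squared error of 0.04 and a slope of -0.4; the negative slope says the prediction should move up:

sqrerror, slope = get_cost_slope(0.8, 1)
print(sqrerror, slope)  # ~0.04, ~-0.4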









python python-3.x pandas neural-network jupyter-notebook

asked Mar 28 at 21:52 by Sid, edited Mar 28 at 22:03
          1 Answer
You need to create a method that backpropagates through the perceptron and optimizes the weights.



def optimize(x1, x2, actual, predicted):
    global w1, w2, b  # update the weights and bias defined above

    sqrerror = (predicted - actual) ** 2
    cost_deriv = 2 * (predicted - actual)        # derivative of the squared-error cost
    sigmoid_deriv = predicted * (1 - predicted)  # derivative of the sigmoid, in terms of its output

    learning_rate = 0.001  # used to scale the gradients

    # Gradient descent update rule
    w1 -= (cost_deriv * sigmoid_deriv * x1) * learning_rate
    w2 -= (cost_deriv * sigmoid_deriv * x2) * learning_rate
    b -= (cost_deriv * sigmoid_deriv) * learning_rate

    return sqrerror


Since

$$\frac{\partial J}{\partial w_1} = \frac{\partial J}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial z} \cdot \frac{\partial z}{\partial w_1} = 2(\hat{y} - y) \cdot \hat{y}(1 - \hat{y}) \cdot x_1,$$

where $J$ is the cost function and $\hat{y} = \operatorname{sigmoid}(z)$ is the prediction (the updates for $w_2$ and $b$ follow the same chain rule, with $x_2$ and $1$ as the last factor).




answered Mar 29 at 4:02 by Shubham Panchal
• Awesome! That explains a lot :) thank you! – Sid, Mar 29 at 11:02











