Creating a single perceptron for training
I'm learning how a perceptron works and attempted to create a function out of it. I recently watched a YouTube video as an introduction to the topic. Right now I'm trying to mimic the presenter's function, and I would like to apply it to a sample dataset:
import numpy as np
import pandas as pd

#        x1   x2   y
data = [[3.5, 1.5, 1],
        [2.0, 1.0, 0],
        [4.0, 1.5, 1],
        [3.0, 1.0, 0],
        [3.5, 0.5, 1],
        [2.0, 0.5, 0],
        [5.5, 1.0, 1],
        [1.0, 1.0, 0],
        [4.5, 1.0, 1]]
data = pd.DataFrame(data, columns=["Length", "Width", "Class"])
Sigmoid function:
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
Perceptron function:
w1 = np.random.randn()
w2 = np.random.randn()
b = np.random.randn()

def perceptron(x1, x2, w1, w2, b):
    z = (w1 * x1) + (w2 * x2) + b
    return sigmoid(z)
My question is: how can I add the cost function inside the perceptron and loop it n times, based on a parameter, so that the weights are adjusted using the cost function?
def get_cost_slope(b, a):
    """
    b = predicted value
    a = actual value
    """
    sqrerror = (b - a) ** 2
    slope = 2 * (b - a)
    return sqrerror, slope
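For reference, here is how I call a single forward pass and the cost on the first row; the weight values 0.5, -0.5 and 0.1 are just arbitrary numbers for the example:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def perceptron(x1, x2, w1, w2, b):
    z = (w1 * x1) + (w2 * x2) + b
    return sigmoid(z)

def get_cost_slope(b, a):
    """b = predicted value, a = actual value."""
    sqrerror = (b - a) ** 2
    slope = 2 * (b - a)
    return sqrerror, slope

# One forward pass on the first row (Length=3.5, Width=1.5, Class=1),
# using arbitrary example weights and bias.
pred = perceptron(3.5, 1.5, 0.5, -0.5, 0.1)
cost, slope = get_cost_slope(pred, 1)
print(pred, cost, slope)
```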
python python-3.x pandas neural-network jupyter-notebook
edited Mar 28 at 22:03
asked Mar 28 at 21:52 by Sid (4 bronze badges)
1 Answer
You need to create a method that backpropagates through the perceptron and optimizes the weights.
def optimize(x1, x2, a):
    """One gradient-descent step; a is the actual value."""
    global w1, w2, b
    pred = perceptron(x1, x2, w1, w2, b)  # predicted value
    cost_deriv = 2 * (pred - a)           # derivative of the squared error
    sigmoid_deriv = pred * (1 - pred)     # derivative of the sigmoid function
    learning_rate = 0.001                 # used to scale the gradients
    w1 -= (cost_deriv * sigmoid_deriv * x1) * learning_rate  # gradient descent update rule
    w2 -= (cost_deriv * sigmoid_deriv * x2) * learning_rate
    b -= (cost_deriv * sigmoid_deriv) * learning_rate
Since
$$\frac{\partial J}{\partial w_1} = \frac{\partial J}{\partial \hat{y}} \cdot \frac{\partial \hat{y}}{\partial z} \cdot \frac{\partial z}{\partial w_1} = 2(\hat{y} - a)\,\hat{y}(1 - \hat{y})\,x_1,$$
where $J$ is the cost function, $\hat{y}$ the sigmoid output, and $z$ the pre-activation.
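To loop this n times, you can wrap the update rule in a training loop. Here is a minimal self-contained sketch; the epoch count, learning rate, and random per-sample (stochastic gradient descent) selection are my own arbitrary choices, not fixed requirements:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train(data, n_epochs=20000, learning_rate=0.1, seed=0):
    """Stochastic gradient descent on a single sigmoid perceptron."""
    rng = np.random.default_rng(seed)
    w1, w2, b = rng.standard_normal(3)             # random initial parameters
    for _ in range(n_epochs):
        x1, x2, y = data[rng.integers(len(data))]  # pick one sample at random
        pred = sigmoid(w1 * x1 + w2 * x2 + b)      # forward pass
        cost_deriv = 2 * (pred - y)                # d(squared error)/d(pred)
        sigmoid_deriv = pred * (1 - pred)          # d(pred)/dz
        w1 -= cost_deriv * sigmoid_deriv * x1 * learning_rate
        w2 -= cost_deriv * sigmoid_deriv * x2 * learning_rate
        b -= cost_deriv * sigmoid_deriv * learning_rate
    return w1, w2, b

data = [[3.5, 1.5, 1], [2.0, 1.0, 0], [4.0, 1.5, 1],
        [3.0, 1.0, 0], [3.5, 0.5, 1], [2.0, 0.5, 0],
        [5.5, 1.0, 1], [1.0, 1.0, 0], [4.5, 1.0, 1]]
w1, w2, b = train(data)
preds = [int(sigmoid(w1 * x1 + w2 * x2 + b) > 0.5) for x1, x2, _ in data]
print(preds)  # compare against the Class column
```

Since this dataset is linearly separable (every class-1 row has Length >= 3.5, every class-0 row has Length <= 3.0), enough epochs should recover the Class column.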
Awesome! That explains a lot :) thank you!
– Sid
Mar 29 at 11:02
answered Mar 29 at 4:02 by Shubham Panchal (1,762 reputation; 2 gold, 2 silver, 17 bronze badges)