
Commit

Update gh-pages to output generated at 843f699
marcolivierarsenault committed Nov 3, 2024
1 parent 47a86c1 commit 694f593
Showing 2 changed files with 6 additions and 6 deletions.
4 changes: 2 additions & 2 deletions feed.xml
@@ -5,8 +5,8 @@
<description>Data Blog by Marc-Olivier Arsenault</description>
<link>https://coffeeanddata.ca/</link>
<atom:link href="https://coffeeanddata.ca/feed.xml" rel="self" type="application/rss+xml"/>
-<pubDate>Wed, 25 Sep 2024 01:59:41 +0000</pubDate>
-<lastBuildDate>Wed, 25 Sep 2024 01:59:41 +0000</lastBuildDate>
+<pubDate>Sun, 03 Nov 2024 17:56:13 +0000</pubDate>
+<lastBuildDate>Sun, 03 Nov 2024 17:56:13 +0000</lastBuildDate>
<generator>Jekyll v4.3.4</generator>

<item>
8 changes: 4 additions & 4 deletions lossless-triplet-loss.html
@@ -139,7 +139,7 @@ <h2 id="the-problem">THE PROBLEM</h2>
loss = K.maximum(basic_loss,0.0)

return loss

def create_base_network(in_dims, out_dims):
"""
Base network to be shared.
@@ -155,7 +155,7 @@ <h2 id="the-problem">THE PROBLEM</h2>
model.add(BatchNormalization())

return model

in_dims = (N_MINS, n_feat)
out_dims = N_FACTORS

@@ -240,7 +240,7 @@ <h2 id="the-problem">THE PROBLEM</h2>

<h2 id="other-losses">OTHER LOSSES</h2>

-<p>Another famous loss function, the contrastive loss described by Yann LeCun and his team in their paper <a href="http://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf">Dimensionality Reduction by Learning an Invariant Mapping</a>, also maxes the negative result, which creates the same issue.</p>
+<p>Another famous loss function, the contrastive loss described by Yann LeCun and his team in their paper <a href="https://yann.lecun.com/exdb/publis/pdf/hadsell-chopra-lecun-06.pdf">Dimensionality Reduction by Learning an Invariant Mapping</a>, also maxes the negative result, which creates the same issue.</p>

<figure>
<img src="assets/images/posts/20180215/lecunFormula.png#center" alt="The Contrastive Loss Function, (LeCun)" />
@@ -323,7 +323,7 @@ <h2 id="first-results">FIRST RESULTS</h2>
# distance between the anchor and the negative
neg_dist = tf.reduce_sum(tf.square(tf.subtract(anchor,negative)),1)

-#Non Linear Values
+#Non Linear Values

# -ln(-x/N+1)
pos_dist = -tf.log(-tf.divide((pos_dist),beta)+1+epsilon)
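For context, the last hunk above shows the core idea of the lossless triplet loss from the post being rebuilt here: squared embedding distances are pushed through the non-linearity -ln(-x/N + 1), so the loss keeps a usable gradient instead of flattening at the margin. A minimal NumPy sketch of that transform follows; the diff only shows the positive-distance branch, so the negative-distance branch and the final reduction are assumptions patterned on the original post (`beta` is typically set to the embedding dimension, with embeddings assumed to lie in [0, 1], e.g. sigmoid outputs):

```python
import numpy as np

def lossless_triplet_loss(anchor, positive, negative, beta, epsilon=1e-8):
    """Sketch of the non-linear triplet loss.

    anchor/positive/negative: (batch, n_dims) embeddings assumed to lie in
    [0, 1], so each squared distance falls in [0, beta] when beta = n_dims.
    epsilon keeps the log argument strictly positive at the boundary.
    """
    # squared euclidean distances, as in the TF snippet in the diff
    pos_dist = np.sum(np.square(anchor - positive), axis=1)
    neg_dist = np.sum(np.square(anchor - negative), axis=1)

    # -ln(-x/N + 1): near-zero positive distance -> near-zero loss,
    # distance approaching N -> loss grows without bound
    pos_dist = -np.log(-pos_dist / beta + 1 + epsilon)
    # mirrored branch (assumed): penalise negatives that sit too close
    neg_dist = -np.log(-(beta - neg_dist) / beta + 1 + epsilon)

    return np.mean(pos_dist + neg_dist)
```

With a well-separated triplet the loss is essentially zero, while a triplet whose positive is far and whose negative is close produces a large value, which is the behaviour the non-linearity is meant to guarantee.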

0 comments on commit 694f593
