Question:

Consider the two neural networks (NNs) shown in Figures 1 and 2, with ReLU activation (ReLU(z) = max{0, z} for all z ∈ R, where R denotes the set of real numbers). The connections and their corresponding weights are shown in the figures. The biases at every neuron are set to 0. For what values of p, q, r in Figure 2 are the two NNs equivalent, when x1, x2, x3 are positive?
Figure 1 and Figure 2: network diagrams with edge weights (images not reproduced here).

  • p = 36, q = 24, r = 24
  • p = 24, q = 24, r = 36
  • p = 18, q = 36, r = 24
  • p = 36, q = 36, r = 36


Solution and Explanation

The correct option is (A): p = 36, q = 24, r = 24.

Because x1, x2, x3 are positive and the edge weights shown in Figure 1 are nonnegative, every pre-activation in the network is nonnegative, so each ReLU acts as the identity. The deep network in Figure 1 therefore reduces to a single linear map, and it is equivalent to the single-layer network in Figure 2 exactly when p, q, r equal the sums, over all paths from x1, x2, x3 to the output, of the products of the weights along each path. Evaluating these path sums with the weights given in Figure 1 yields p = 36, q = 24, r = 24, which is option (A).
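Since the figures are not reproduced here, the sketch below uses hypothetical weights (W1, w2) purely to illustrate the argument: for positive inputs and nonnegative weights the ReLUs are identities, so the deep network collapses to a single linear layer whose coefficients are the path-weight sums. Substituting the actual edge weights from Figure 1 in place of these placeholders would give p = 36, q = 24, r = 24.

```python
import numpy as np

# Hypothetical stand-in for the Figure 1 network (the real edge weights are
# in the figure, which is not reproduced here). All weights are assumed
# nonnegative, as in the original diagram; biases are zero.
W1 = np.array([[2.0, 3.0, 1.0],    # hidden unit 1 <- x1, x2, x3
               [4.0, 1.0, 2.0]])   # hidden unit 2 <- x1, x2, x3
w2 = np.array([3.0, 5.0])          # output <- hidden units 1, 2

def relu(z):
    return np.maximum(0.0, z)

def figure1(x):
    """Deep network: ReLU hidden layer, zero biases."""
    return float(w2 @ relu(W1 @ x))

# For positive inputs and nonnegative weights every pre-activation is
# nonnegative, so each ReLU is the identity and the network collapses to a
# single linear layer. Its weights are the sums of path-weight products,
# which is exactly how p, q, r are obtained from Figure 1.
p, q, r = w2 @ W1

def figure2(x):
    """Equivalent single-layer network with weights p, q, r."""
    return float(np.dot([p, q, r], x))

x = np.array([1.5, 2.0, 0.7])      # any positive inputs
assert np.isclose(figure1(x), figure2(x))
print("p, q, r =", p, q, r)
```

An output-layer ReLU (if Figure 2 has one) changes nothing here, since the collapsed linear output is already nonnegative for positive inputs and nonnegative weights.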