# Current Density Within a Wire Having Variable Resistivity

EANDM-@XHA0B

Figure 1 depicts a wire of outer radius $R = 0.700 \text{ mm}$ and length $l = 6.00 \text{ mm}$, containing a central hole of diameter $s = 0.400 \text{ mm}$.

The wire has a resistivity that depends on the radial distance $r$ from the wire's axis as:

$$\rho = \frac{B}{r}$$

where $B = 2.54 \times 10^{-6} \; \Omega \cdot \rm{m}^2$.

If a 10 V potential difference is held across the ends of the wire, what is the magnitude of the current density at a radial distance $c = 0.400 \text{ mm}$ from the center of the wire?

A

$J = 6.56 \times 10^8 \; \rm{\frac{A}{m^2}}$

B

$J = 1.57 \times 10^3 \; \rm{\frac{A}{m^2}}$

C

$J = 2.63 \times 10^5 \; \rm{\frac{A}{m^2}}$

D

$J = 1.57 \times 10^6 \; \rm{\frac{A}{m^2}}$
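As a quick numerical check, one can note that the current flows axially, so the electric field $E = V/l$ is uniform along the wire, and the local current density follows from $J(r) = E/\rho(r) = Vr/(Bl)$. A minimal sketch of that arithmetic (variable names are my own):

```python
# Numerical check of J(r) = E / rho(r), with E = V / l and rho(r) = B / r,
# which gives J(r) = V * r / (B * l). Since c > s/2, the point lies in the
# conducting material, not the hole.
V = 10.0      # potential difference across the wire ends (V)
l = 6.00e-3   # wire length (m)
B = 2.54e-6   # resistivity coefficient (ohm * m^2)
c = 0.400e-3  # radial distance of interest (m)

E = V / l          # uniform axial field (V/m)
J = E * c / B      # current density at r = c (A/m^2)
print(f"J = {J:.3e} A/m^2")  # ≈ 2.62e5 A/m^2
```

The result, about $2.6 \times 10^5 \; \rm{A/m^2}$, lines up with choice C.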