To measure 100 amps DC I am using a multimeter and an 18.75" length of 0.128"-diameter solid copper wire (#8).
The voltage drop measured across the wire in millivolts directly equals the current in amps, because that length of #8 wire has a resistance of very nearly 1 milliohm.
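For anyone who wants to check the numbers, here is a quick sanity check of the "1 mV per amp" claim from the resistivity of annealed copper (the 1.724e-8 ohm-meter figure is the standard 20 °C handbook value; the wire dimensions are the ones quoted above):

```python
# Sanity-check the "1 mV per amp" claim: resistance of 18.75" of 0.128" solid copper.
import math

RHO_CU = 1.724e-8          # resistivity of annealed copper at 20 deg C, ohm*m
length = 18.75 * 0.0254    # 18.75 inches converted to meters
diameter = 0.128 * 0.0254  # 0.128 inches converted to meters

area = math.pi * diameter**2 / 4     # cross-sectional area, m^2
resistance = RHO_CU * length / area  # ohms

print(f"R = {resistance * 1000:.3f} milliohms")  # ~0.989 milliohms, so ~1 mV per amp
```

So the shunt comes out at about 0.989 milliohms at 20 °C, which is within about 1% of the nominal 1 mV/A scale factor.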
Can you explain how you calibrated your device? Are you using a low-resistance ohmmeter to measure the resistance of the copper wire, or do you have a calibrated current source?
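One thing worth checking in any calibration: copper's resistance rises roughly 0.39% per °C, so a wire shunt calibrated cold will read high once it self-heats at 100 A. A rough sketch of the effect, using the standard tempco for annealed copper and an assumed nominal 1-milliohm shunt:

```python
# Copper shunt reading vs. temperature: resistance rises ~0.393%/deg C,
# so a shunt trimmed to 1 milliohm at 20 deg C reads high when it warms up.
ALPHA_CU = 0.00393  # temperature coefficient of annealed copper, per deg C

def shunt_resistance(r20, temp_c):
    """Resistance at temp_c given the 20 deg C value (linear tempco model)."""
    return r20 * (1 + ALPHA_CU * (temp_c - 20.0))

r20 = 1.0e-3  # assumed nominal 1-milliohm shunt
for t in (20, 40, 60):
    mv_at_100a = 100 * shunt_resistance(r20, t) * 1000
    print(f"{t} deg C: 100 A reads {mv_at_100a:.1f} mV")
```

At 60 °C the same 100 A would read about 116 mV, a 16% error, so the wire either needs to stay cool or the reading needs a temperature correction.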