We can get a rough first estimate by considering how wide a swath of shadow the Earth casts relative to the total length of the orbit.
At 500 km altitude plus the 6371 km Earth radius ($r$ = 6871 km), the satellite's circular orbit is a circle of circumference $2 \pi r$, or about 43,172 km.
Assume the Sun's rays at Earth's distance are essentially parallel, so the shadow cast across the orbit is as wide as the Earth itself: one diameter, 12,742 km.
So the satellite spends about 12742/43172 ≈ 0.295 of each orbit in shadow; with an orbital period of roughly 94.6 minutes at this altitude, that comes to about 1676 seconds (27.93 minutes) per orbit.
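As a quick sanity check, here is a minimal Python sketch of this back-of-the-envelope estimate. The orbital period isn't stated above, so the sketch derives it from Kepler's third law using an assumed standard gravitational parameter $\mu \approx 3.986 \times 10^5$ km³/s²:

```python
import math

MU = 3.986004418e5   # Earth's gravitational parameter, km^3/s^2 (assumed value)
R_EARTH = 6371.0     # mean Earth radius, km
ALT = 500.0          # orbital altitude, km

r = R_EARTH + ALT                            # orbital radius: 6871 km
circumference = 2 * math.pi * r              # ~43,172 km
period = 2 * math.pi * math.sqrt(r**3 / MU)  # Kepler's third law: ~5670 s (~94.5 min)

# Naive estimate: shadow width (one Earth diameter) as a fraction of the circumference
fraction = (2 * R_EARTH) / circumference     # ~0.295
print(f"naive shadow time: {fraction * period:.0f} s ({fraction * period / 60:.2f} min)")
```

This prints about 1673 s (27.9 min), matching the figure above to within the rounding of the period.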
That's a severe underestimate, though. For a more accurate computation you'd need to account for the fact that the shadow width is a chord across the orbital circle, while the satellite actually travels along the longer arc that the chord subtends; since the shadow spans a large fraction of the orbit, the difference is significant. (If the orbital altitude were much higher, the chord would be much closer to the arc, and this method would be more accurate.)
For that, you can do a little bit of trig: if $c$ is the shadow width (Earth's diameter) and $r$ is the orbital radius (Earth radius plus orbital altitude), then the angular width of the shadowed arc is $$2 \arcsin \frac{c}{2r},$$ which comes to about 136 degrees, or 0.378 of the circle: 2145 seconds (35.74 minutes). See HopDavid's answer for the excellent diagram corresponding to this.
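The same refinement as a standalone sketch, under the same assumed $\mu$; note that with $c = 2 R_{Earth}$, the ratio $c/2r$ simplifies to $R_{Earth}/r$:

```python
import math

MU = 3.986004418e5   # Earth's gravitational parameter, km^3/s^2 (assumed value)
R_EARTH = 6371.0     # mean Earth radius, km
r = R_EARTH + 500.0  # orbital radius, km

period = 2 * math.pi * math.sqrt(r**3 / MU)  # Kepler's third law: ~94.5 min

# The shadow chord c = 2*R_EARTH subtends an arc of 2*arcsin(c / (2r)),
# and c/(2r) reduces to R_EARTH/r.
half_angle = math.asin(R_EARTH / r)          # ~68 degrees, in radians
fraction = (2 * half_angle) / (2 * math.pi)  # ~0.378 of the orbit
print(f"refined shadow time: {fraction * period:.0f} s ({fraction * period / 60:.2f} min)")
```

This prints about 2142 s (35.7 min), again within rounding of the 2145 s quoted above.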
For a still more accurate computation you'd need to take into account the Sun's distance and radius, and make a decision about umbra vs. penumbra.
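For a rough sense of how big those corrections are (assuming a solar radius $R_\odot \approx 6.96 \times 10^5$ km and a mean Earth-Sun distance $d \approx 1.496 \times 10^8$ km), the Sun's angular radius is $$\alpha_\odot = \arcsin \frac{R_\odot}{d} \approx 0.27^\circ,$$ so the umbra narrows at about this half-angle behind Earth. Over the few thousand kilometers that a 500 km orbit extends behind the terminator, that shaves only a few tens of kilometers off the shadow width, with a penumbral fringe of comparable size: small corrections to the numbers above.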