The recent miniaturization of cameras has enabled finger-based reading approaches that provide blind and visually impaired readers with access to printed materials. Compared to handheld text scanners such as mobile phone applications, mounting a tiny camera on the user's own finger has the potential to mitigate camera framing issues, enable a blind reader to better understand the spatial layout of a document, and provide greater control over reading pace. A finger-based approach, however, also introduces the need to guide the reader in physically navigating a document, such as tracing along lines of text. While previous work has proposed audio and haptic directional finger guidance for this purpose, prior user studies have not offered an in-depth performance analysis of the finger-based reading process itself. To further investigate the effectiveness of finger-based sensing and feedback for reading printed text, we conducted a controlled laboratory experiment with 19 blind participants, comparing audio and haptic directional finger guidance within an iPad-based testbed. As a small follow-up, we asked four of those participants to return and provide feedback on a preliminary wearable prototype called HandSight. Findings from the controlled experiment show similar overall performance for haptic and audio directional guidance, although audio may offer an accuracy advantage when tracing lines of text. Subjective feedback also highlights trade-offs between the two types of guidance, such as the interference of audio guidance with speech output and the potential for desensitization to haptic guidance. While several participants appreciated the direct access to layout information that finger-based exploration provides, important concerns also arose about ease of use and the amount of concentration required. We close with a discussion of the effectiveness of finger-based reading for blind users and of potential design improvements to the HandSight prototype.