In this paper we present and evaluate painterly rendering techniques that operate within the visual feedback loop of eDavid, our painting robot, which aims to simulate the human painting process. Two such methods are compared on different objects: one selects from a predefined set of stroke candidates, while the other generates strokes directly using line integral convolution. We discuss the aesthetics of both methods and present results.