There are three aspects to this, Joe:
1) Film grain vs. digital noise
2) Tonal response, colors, contrast, etc.
3) Digital vs. wet printing
Regarding 1), I think you already know the answer. It's hard to summarize in a single statement because there are many degrees of freedom: it depends on the film/developer combination, the ISO capabilities of the digital camera, and so on. But I think most of us can see the difference once a picture is blown up enough.
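One structural difference that survives almost any film/sensor combination: digital noise is essentially independent per pixel, while film grain is made of clumps of developed silver, so neighboring pixels in a scan of it are correlated. A rough numpy sketch (the blur-by-blocks grain model is a crude stand-in, not a real grain simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat mid-gray test patch, values in [0, 1].
patch = np.full((256, 256), 0.5)

# Digital sensor noise: independent Gaussian noise at every pixel.
digital = patch + rng.normal(0.0, 0.02, patch.shape)

# Film grain (very rough model): coarse random values upscaled into
# 4x4 clumps, so neighboring pixels share the same deviation.
coarse = rng.normal(0.0, 1.0, (64, 64))
clumps = np.kron(coarse, np.ones((4, 4)))  # 64x64 -> 256x256 blocks
grain = patch + 0.03 * clumps

# Correlation between horizontally adjacent pixels distinguishes them.
def neighbor_corr(img):
    a = img - img.mean()
    return float(np.mean(a[:, :-1] * a[:, 1:]) / np.mean(a * a))

print(round(neighbor_corr(digital), 3))  # near zero
print(round(neighbor_corr(grain), 3))    # clearly positive
```

That correlation (clumpiness) is a big part of why grain and noise look different at the same amplitude, even before you pick a particular film stock.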
Let's skip 3), since I assume you are talking about looking at an image on a monitor (the difference would largely depend on exactly how you print, digitally or wet).
Regarding 2), modern monitors display 8 bits per color channel.
With modern scanners or digital cameras, 12-14 bits per color channel is typical. Depending on the film, a film camera can produce a negative with up to about 17 bits of dynamic range. However, you have to scan it to get it onto your monitor, which reduces that range to the scanner's dynamic range (or less). And whatever input you use (film or digital), the extra 4-6 bits that a digital camera or scanner captures over the monitor's 8 bits are enough headroom to mimic any tonal film response digitally; provided, of course, that your picture was exposed correctly. So 2) reduces to nothing. Anyone claiming a difference there just doesn't know how to use Photoshop.
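The headroom argument can be made concrete: apply the same S-shaped tone curve to a smooth ramp captured at 14 bits and to one already quantized to 8 bits, then count how many of the 256 output levels survive. (The smoothstep curve below is just a hypothetical stand-in for any film-like response curve.)

```python
import numpy as np

# Hypothetical film-like S-curve on linear input in [0, 1]
# (smoothstep: boosts midtone contrast, compresses shadows/highlights).
def film_curve(x):
    return x**2 * (3 - 2 * x)

# The same smooth gradient, captured at 14-bit vs. 8-bit precision.
ramp14 = np.arange(2**14) / (2**14 - 1)
ramp8 = np.arange(2**8) / (2**8 - 1)

def to_8bit(x):
    # Apply the curve, then quantize to the monitor's 8 bits.
    return np.round(film_curve(x) * 255).astype(int)

levels_from_14bit = len(np.unique(to_8bit(ramp14)))
levels_from_8bit = len(np.unique(to_8bit(ramp8)))

print(levels_from_14bit)  # all 256 output levels present
print(levels_from_8bit)   # fewer than 256: some levels merged or skipped
```

Starting from 14 bits, the curve lands on every one of the 256 display levels; starting from 8 bits, the steep midtone section skips levels and the flat ends merge them, which is posterization. That is exactly why the extra capture bits let you reshape tonality freely.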
If you look at the final picture on a monitor, digital vs. film comes down to noise vs. grain. That's the only rational difference.
Roland.
PS: now, "what's considered" is not necessarily rational, but that's another story 🙂