So let me throw in a bunch of random musings based on recent experience. Hopefully some of my descriptions will be helpful.
I'm currently making my new sample library in both Kontakt and SFZ format. I primarily use Kontakt because the tools are easier to use, such as the ability to have a filter change based on velocity, or simply the ability to control how much a velocity layer changes in volume from low to high, which makes the transitions between velocity layers less noticeable. The SFZ format allows for all of this too, but you have to script it rather than use a graphical tool. So I make it first in Kontakt, and then mimic the curves when I make the SFZ version.
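To give a concrete idea of what "scripting it" looks like on the SFZ side, those two things mostly come down to a couple of opcodes per region. A rough sketch (the sample name, velocity range and numbers are placeholders, not my actual settings):

    <region>
    sample=Samples/C4_mf.wav
    // velocity range this recorded layer covers
    lovel=64 hivel=95
    // amp_veltrack below 100 shrinks the volume swing inside the layer,
    // which helps hide the step up to the next layer
    amp_veltrack=70
    // lowpass that opens with velocity, so harder hits come out brighter
    fil_type=lpf_2p
    cutoff=4000
    fil_veltrack=2400

In Kontakt I just draw the equivalent curves graphically, then pick SFZ values that approximate them by ear.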
Now, you can go crazy and sample all 88 keys at 127 velocity layers with pedal up and pedal down, release noises, una corda pedal, etc. But I've found that just recording minor thirds up the piano is enough to get good realism (always pitch down, never pitch up the samples). I did fourths at first, but when you play major scales, hearing 3 notes in succession that were pitch shifted sounds fake. Minor thirds are about the point where you save time recording/processing/mapping without sacrificing too much realism. I record pedal up and pedal down at 4 velocity layers, which lets me capture the full breadth in brightness of the piano. From those 4 velocity layers, you can also make "virtual velocity layers", which I can describe in more detail, but basically it's just taking a louder layer, cutting the gain, and applying a strategic cutoff filter to mimic the brightness at that velocity range. It's amazing how that sort of processing can take a minimal number of samples and create something pretty realistic.
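To make the minor-third mapping and the "virtual layer" trick concrete, here's roughly what one slice of the SFZ version looks like (file names, note numbers and values are made up for illustration):

    // real recorded layer: sample played at E3 (MIDI 52), stretched down a minor third
    // pitch_keycenter sits at the TOP of the zone, so notes only ever pitch down
    <region>
    sample=Samples/E3_f_pedalUp.wav
    pitch_keycenter=52 lokey=50 hikey=52
    lovel=96 hivel=127

    // "virtual" velocity layer: reuse the louder sample, cut the gain,
    // and close a lowpass to mimic the duller tone of a softer hit
    <region>
    sample=Samples/E3_f_pedalUp.wav
    pitch_keycenter=52 lokey=50 hikey=52
    lovel=64 hivel=95
    volume=-6
    fil_type=lpf_2p
    cutoff=3500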
So with those tools in mind, a new sample-based virtual instrument engine needs a basic sample playback engine at the very least, and then you have to decide how much pre-processing you want to do to the samples versus having the engine do the filtering/processing in realtime (like Kontakt). Feel free to use any of my sample libraries to experiment.
In recent weeks, I have been scripting half pedaling, mimicking Garritan. I've found that SFZ is much easier because it allows me to do realtime amplifier envelope changes, that is, I can get CC64 values to change the release time and volume in realtime. Kontakt, however, does not allow you to do this unless you record ANOTHER sample to control the entire ADSR of that sample. So what I'm trying to say is that when you make a piano playback engine, you might want to consider how you will control ADSR envelopes.
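For what it's worth, the SFZ side of that looks something like the sketch below. The exact opcode spellings vary by player and spec version (ampeg_release_oncc64 is the newer spelling, some players want ampeg_releasecc64, and the per-CC gain opcode shows up as gain_cc64 or volume_oncc64), and the times/levels here are placeholders:

    <region>
    sample=Samples/E3_f_pedalUp.wav
    // short release with the pedal fully up...
    ampeg_release=0.4
    // ...plus extra release time added as CC64 rises, so half-pedal
    // positions land somewhere between fully damped and fully sustained
    ampeg_release_oncc64=2.5
    // optionally let CC64 also push the level up a touch (in dB)
    gain_cc64=1.5

The point is just that the pedal CC can reshape the release behaviour on the fly, without needing extra samples.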
Hope some of that makes sense.