Here's a link to what Pixii says they are doing:
Monochrome mode. They aren't shifting the sensor because there's no way to shift it; it's rigidly attached to the camera. (There's also no IBIS and no "ultrasonic" dust removal.)
Short summary of their longer explanation from their blog: In any Bayer-array color digital image, only the luminance (light-intensity) value of each sensor pixel is an actual measurement. The color values are computed: one color value is derived from the known characteristics of the filter over that sensor pixel, and the other two are interpolated from nearby pixels.
Because Pixii knows exactly how the color values are computed, and knows the exact characteristics of the filter array over the sensor, they can reverse the computation to factor out the chroma estimates and reconstruct the original luminance measurement. The only thing lost is the light the color filter absorbs before it ever reaches the photosite; Pixii says this amounts to about 1 stop.
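To make the idea concrete, here's a minimal sketch (not Pixii's actual pipeline, and their method is patent-pending, so the details surely differ): with a simple demosaic, each pixel's own-filter channel in the RGB output is still the raw measurement, and only the other two channels are interpolated. If you know the Bayer layout, you can pull that measured value back out of the color image, pixel by pixel, to rebuild the monochrome mosaic. All names below are illustrative.

```python
import numpy as np

# Channel index (0=R, 1=G, 2=B) of the filter at each site
# in a 2x2 RGGB Bayer tile.
BAYER = np.array([[0, 1],   # R G
                  [1, 2]])  # G B

def mosaic_channel_map(h, w):
    """Which color channel was actually measured at each pixel."""
    return BAYER[np.arange(h)[:, None] % 2, np.arange(w)[None, :] % 2]

def recover_raw(rgb):
    """Select each pixel's own-filter channel out of a demosaiced image,
    recovering the original per-pixel luminance measurements."""
    h, w, _ = rgb.shape
    ch = mosaic_channel_map(h, w)
    return np.take_along_axis(rgb, ch[..., None], axis=2)[..., 0]
```

A real demosaic also applies white balance, color-matrix, and gamma steps, all of which would have to be inverted too; that's where knowing the exact filter characteristics (and the roughly 1-stop filter absorption) comes in.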
(So why doesn't everyone do it that way? For one thing, because Pixii has a patent application on file...)
FYI, Pixii's latest owner email includes a link to a post by a photo enthusiast named Jim Chung, who compared the Pixii's monochrome mode to a monochrome image produced by sensor-shifting (using an OM System OM-1) and to one produced by a true monochrome sensor, the Phase One IQ260 Achromatic.