Can we assume that any spec, or measurement data, from before M0, M1, M2, M3 were ratified is M0 by default?
AFAIK M0 instrumentation for ISO/GRACoL measurement data was used up until about 2006, despite the fact that OBAs were present in most of the offset papers used back then (and today). (OBAs were, and are, not generally used in proofing papers.) The M0 measurement condition does not define the UV content of the illumination, so M0 is not recommended when the measured sheets (or inks/colorants) exhibit fluorescence (the typical situation) and there is a need to exchange measurement data between facilities.
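One way to see why undefined UV content matters is to compare paired readings of the same patch under M0 and M1 and check the colour difference. Here is a minimal sketch using the CIE76 ΔE*ab formula; the Lab values are hypothetical illustrations of an OBA-loaded paper white, not real instrument data.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two CIELAB triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical paired readings of the same paper-white patch:
m0 = (95.2, 1.8, -6.5)   # M0: UV content of the lamp is undefined
m1 = (94.8, 2.4, -9.1)   # M1: D50 UV content defined, OBA fluorescence excited

print(round(delta_e_ab(m0, m1), 2))
```

With OBA-rich stock the b* shift alone can push the difference well past a typical press tolerance, which is exactly why exchanging M0 data between facilities with different instruments is risky.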
Then, around 2013, ISO and ANSI/CGATS introduced a new set of standards and profiles based on M1, and the good old ISO 12647-2 standard was updated using M1 measurements. So the newer G7-based CGATS.21 and ISO 15339 are all based on M1 measurement, even though M0 is still by far the most common instrument measurement condition in use today.
I believe the differences will be significant between Status T and Status E density responses, and/or when a polarizer (M3) is added. Polarized instruments (Status E) are pretty typical in Europe but not in North America (Status T).
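To make the Status T vs. Status E point concrete: reflection density is basically the negative log of reflectance weighted by the instrument's status response, so different weighting tables give different density numbers from the same patch. This sketch uses made-up narrow-band reflectances and made-up weight tables purely for illustration; the real weightings are defined in ISO 5-3 and are not reproduced here.

```python
import math

def density(reflectance, weights):
    """Reflection density: -log10 of the weighted mean reflectance.
    `weights` stands in for an instrument status response; these
    tables are hypothetical, NOT the ISO 5-3 values."""
    num = sum(w * r for w, r in zip(weights, reflectance))
    return -math.log10(num / sum(weights))

# Hypothetical blue-channel reflectances of a yellow solid:
refl = [0.12, 0.10, 0.09, 0.11]
status_t_like = [0.2, 1.0, 1.0, 0.3]   # wider response (hypothetical)
status_e_like = [0.1, 1.0, 0.6, 0.1]   # narrower response (hypothetical)

print(round(density(refl, status_t_like), 3))
print(round(density(refl, status_e_like), 3))
```

Even with identical reflectance data the two responses report slightly different densities, and a polarizer (M3) shifts things further by suppressing first-surface gloss, so density aims read off one instrument type don't transfer cleanly to the other.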
So you've really got to be very careful about how and what you're measuring relative to which standard/specification you're aiming for.
Oh! Watch your step, I think a worm is trying to get away from the can! ;-)
There really needs to be a practical guide that an ordinary commercial printer can use to understand all this, how to go about aligning to a standard, and how to follow specifications. The ones that are out there (e.g. published by ISO, X-Rite, Techkon, Idealliance et al.) are, IMHO, next to useless for the average printer.
I would be happy to participate in such an undertaking in order to provide the "dummies" perspective.