Hi Gents,
Cybervex,
This is not a common occurrence.
It can happen when the system is allowed to boot with an improper configuration. With RAID 1 or 5, the array usually starts, but in a degraded state, and the mirror or parity drive has to be rebuilt. With RAID 0, the array does not usually "break" and can often be recovered by entering the RAID Configuration Utility and re-marking the disks as members. Now the caveat: UEFI RAID has become part of the BIOS, and in most cases a flash does reset the BIOS configuration. Even if it doesn't, it's good practice to reset it anyway to avoid complications or stability issues later. Examples are feature enhancements, updated microcode, or revised XMP profiles (CPU & memory). Updates to the embedded IME or to the version of the RAID BIOS can also be the cause. This information may not be provided in the release notes, so we don't know.
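If you want to sanity-check whether the member data survived before touching anything, one option is to boot a Linux live USB and look for the Intel RAID (IMSM) metadata that RST writes near the end of each member disk. "mdadm --examine" on a member device will dump it, or a rough sketch like the one below can simply tell you whether the signature is still present. To be clear, this is only an illustration: the device paths are placeholders for your NVMe members, and scanning the last few MiB is a heuristic rather than the exact metadata location.

#!/usr/bin/env python3
# Illustrative check (run as root from a Linux live USB) for surviving
# Intel RAID (IMSM) metadata on suspected RAID member disks.
# It scans the tail of each block device for the ASCII signature that the
# Intel metadata block carries ("Intel Raid ISM Cfg Sig. ").
import os
import sys

SIGNATURE = b"Intel Raid ISM Cfg Sig. "   # IMSM metadata signature (as used by mdadm)
TAIL_BYTES = 4 * 1024 * 1024              # heuristic: scan the last 4 MiB of the device

def has_imsm_metadata(device: str) -> bool:
    """Return True if the IMSM signature appears in the tail of the device."""
    with open(device, "rb") as dev:
        dev.seek(0, os.SEEK_END)
        size = dev.tell()
        dev.seek(max(0, size - TAIL_BYTES))
        tail = dev.read(TAIL_BYTES)
    return SIGNATURE in tail

if __name__ == "__main__":
    # Device paths are placeholders; pass your own member disks as arguments.
    for device in sys.argv[1:] or ["/dev/nvme0n1", "/dev/nvme1n1"]:
        try:
            found = has_imsm_metadata(device)
        except PermissionError:
            print(f"{device}: run as root to read the raw device")
            continue
        except FileNotFoundError:
            print(f"{device}: no such device")
            continue
        print(f"{device}: IMSM signature {'found' if found else 'NOT found'}")

If the signature is still there on both disks, re-marking them as members in the RAID utility has a good chance of bringing the volume back intact; if it's gone, that's a much worse sign.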
In this case, however, my feeling is that the RAID configuration simply didn't persist through the reset. If I were having issues and had to upgrade my BIOS, this is what I would have done: disconnect the drives before the upgrade, then make sure I entered the BIOS and reconfigured the RAID before the system ever made it through a full POST (first restart, with the drives reconnected). How are you supposed to know this? You're not.

RAID 0 isn't overly robust to begin with, and with the newer NVMe architecture most manufacturers, Intel included, are still flying by the seat of their pants on implementation, so reliability and stability are a crapshoot. If this is a bug, it's horrible and I feel your pain. If, however, you didn't immediately enter the BIOS and get the RAID configured before hand-off to the OS loader... well, you were likely a casualty of your own good intentions. Remember, it's common and usually expected that a BIOS will be reset after an upgrade or flash; the newer ones might do it for you automatically.
Speculating here, of course, as we don't have all the facts.