Planned Maintenance
Data Flow is regularly maintained in the last week of every month.
To provide the best and most secure environment for Spark applications, Data Flow undergoes regular maintenance in the last week of every month (adjusted for public holidays). Customers are notified two weeks in advance of the upcoming infrastructure maintenance schedule. During maintenance, the service automatically stops in-progress streaming runs and starts a new streaming run on the updated compute resources.
Data Flow relies on Spark Structured Streaming checkpointing to record the processed offsets, which can be stored in your Object Storage bucket. When the new compute resources are created, the streaming application resumes from the previous checkpoint rather than reprocessing data from the beginning.
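For example, a streaming query can set its checkpointLocation to an Object Storage path so that recorded offsets survive a maintenance restart. The following is a minimal PySpark sketch, not a definitive implementation: the bucket name, namespace, Kafka broker, and topic are placeholders, and the oci:// paths assume the Object Storage filesystem connector available to Data Flow applications.

```python
# Minimal sketch: a checkpointed Structured Streaming query.
# "my-bucket", "my-namespace", the broker address, and the topic
# are hypothetical; substitute values from your own tenancy.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("checkpointed-stream").getOrCreate()

# Read from a streaming source (Kafka here, purely for illustration;
# any Structured Streaming source works the same way).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# The checkpointLocation option tells Spark where to persist processed
# offsets and query state. With this pointed at Object Storage, a new
# streaming run started after maintenance resumes from this checkpoint.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "oci://my-bucket@my-namespace/output/")
    .option("checkpointLocation",
            "oci://my-bucket@my-namespace/checkpoints/events/")
    .start()
)

query.awaitTermination()
```

Note that each streaming query should use its own checkpoint directory; reusing a checkpoint path across unrelated queries can cause offset conflicts.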