I'm using the mssparkutils.notebook.runMultiple function to orchestrate a number of notebooks from one master notebook. I've set the concurrency property to 6, which, as I understand it, simply allows up to 6 notebooks to run in parallel at any time, as long as their dependencies are all met.
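For reference, my DAG definition looks roughly like this (the notebook names, timeouts, and dependency chain here are placeholders, not my real ones):

```python
# Sketch of the DAG dict passed to runMultiple.
# Names/paths below are placeholders for illustration.
DAG = {
    "activities": [
        {
            "name": "NotebookA",             # unique name for this activity
            "path": "NotebookA",             # notebook to execute
            "timeoutPerCellInSeconds": 600,
            "args": {},                      # parameters passed to the notebook
            "dependencies": [],              # no dependencies: eligible to run immediately
        },
        {
            "name": "NotebookB",
            "path": "NotebookB",
            "timeoutPerCellInSeconds": 600,
            "args": {},
            "dependencies": ["NotebookA"],   # starts only after NotebookA succeeds
        },
    ],
    "timeoutInSeconds": 3600,
    "concurrency": 6,  # at most 6 notebooks in flight at once
}

# In the master notebook this is then invoked as:
# mssparkutils.notebook.runMultiple(DAG)
```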
Each notebook runs just fine individually when executed outside of the DAG I've set up. However, when executed inside the DAG, I keep getting a consistent error about files being added to a partition of the delta table the notebook updates, along the lines of: "Files were added to partition by a concurrent update".
Is there anything different about orchestrating notebooks in parallel in a DAG vs. running them individually? To be clear, absolutely none of these notebooks update the same delta table; each notebook updates its own delta table.