Our Snippets Of Knowledge Weekly Blog Series
The Azure Batch Service is the unsung hero of many other cloud workloads, including Azure Data Factory custom activities and Azure Machine Learning model processing. The resource offers a simple but effective abstraction for executing custom code, defined in the resource as jobs. This is similar to calling Azure Functions, but configured differently to support scaling, long-running workloads and improved job handling. Inside the service, compute pools offer a cluster of virtual machines (VMs), with one node in the pool representing one VM. These are defined in the same way as any Azure VM, with an operating system, compute, RAM and storage, and remote desktop access to every node is available for debugging if required. Additionally, the pools support virtual network (VNet) access for wider platform connectivity.
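As a rough illustration, the sketch below shows how a compute pool might be created with the azure-batch Python SDK. The account name, URL, credentials, VM image and pool size are all placeholder assumptions for this example, not values from the post or a recommended configuration.

```python
# A minimal sketch of creating a Batch compute pool with the azure-batch
# Python SDK. Account details, image reference and pool sizing below are
# placeholder assumptions for illustration only.
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

# Hypothetical Batch account details - replace with your own.
credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")
batch_client = BatchServiceClient(
    credentials, batch_url="https://mybatchaccount.westeurope.batch.azure.com"
)

# Define a small pool of Ubuntu VMs; each node in the pool is one VM.
pool = batchmodels.PoolAddParameter(
    id="demo-pool",
    vm_size="STANDARD_D2S_V3",
    virtual_machine_configuration=batchmodels.VirtualMachineConfiguration(
        image_reference=batchmodels.ImageReference(
            publisher="canonical",
            offer="0001-com-ubuntu-server-focal",
            sku="20_04-lts",
            version="latest",
        ),
        node_agent_sku_id="batch.node.ubuntu 20.04",
    ),
    target_dedicated_nodes=2,
)
batch_client.pool.add(pool)
```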
At runtime, jobs can be passed to the resource and executed on the compute pool(s). If many jobs are requested or scheduled, the pool will monitor throughput and scale out the number of nodes (VMs) to handle parallel task execution, then scale back in once the work is complete.
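Continuing the same hypothetical setup, the sketch below shows how a job and its tasks might be submitted to that pool, and how an autoscale formula could grow and shrink the node count with the pending workload. The job and task names, the command line and the formula thresholds are illustrative assumptions only.

```python
# A sketch of submitting a job with tasks to the pool above, plus a simple
# autoscale formula. All IDs and thresholds are illustrative assumptions.
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

# Same hypothetical account details as the pool example above.
credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")
batch_client = BatchServiceClient(
    credentials, batch_url="https://mybatchaccount.westeurope.batch.azure.com"
)

# Create a job bound to the pool defined earlier.
job = batchmodels.JobAddParameter(
    id="demo-job",
    pool_info=batchmodels.PoolInformation(pool_id="demo-pool"),
)
batch_client.job.add(job)

# Add a handful of tasks; Batch schedules them across the pool's nodes.
for i in range(10):
    task = batchmodels.TaskAddParameter(
        id=f"task-{i}",
        command_line=f"/bin/bash -c 'echo processing item {i}'",
    )
    batch_client.task.add(job_id="demo-job", task=task)

# Let the pool scale out with pending tasks and back in when the queue
# drains; the cap of 10 nodes is an arbitrary example value.
autoscale_formula = """
    $pending = avg($PendingTasks.GetSample(TimeInterval_Minute * 5));
    $TargetDedicatedNodes = min(10, $pending);
    $NodeDeallocationOption = taskcompletion;
"""
batch_client.pool.enable_auto_scale(
    pool_id="demo-pool",
    auto_scale_formula=autoscale_formula,
)
```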
See Microsoft Learn for more information on this resource here.
We hope you found this knowledge snippet helpful.
Check out all our posts in this series here.