Encountering a ModuleNotFoundError while running a PySpark job can be frustrating, especially when the module exists and works perfectly outside of Spark. I recently ran into this exact issue while working with a Spark job that imported a function from a local utils module. Everything seemed fine until I tried to use a PySpark UDF, and suddenly, the module was nowhere to be found.
After some debugging, I found a straightforward way to fix it that worked perfectly for me.
Understanding the ModuleNotFoundError Problem
PySpark runs code on distributed nodes, and those nodes might not have access to the same Python environment as the driver (the machine running the job). When a Python UDF is executed on worker nodes, it gets serialized and shipped over the network. If the UDF relies on a module that isn’t available on the workers, PySpark throws a ModuleNotFoundError.
That’s why an import statement might work fine at the top of a script but fail when running inside a UDF. PySpark workers simply don’t know where to find the custom module unless explicitly told.
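One way to see this is that Python resolves an import statement wherever and whenever it executes: a top-of-file import runs once, on the driver, while an import inside a function body runs each time the function is called, in whichever process the call happens. A minimal, Spark-free sketch of that behavior, using a stdlib module as a stand-in for a custom one:

```python
def shout(text):
    # This import statement runs on every call, in whatever
    # process/environment the call happens to execute in.
    import string  # stand-in for a custom utils module

    allowed = set(string.ascii_letters + " ")
    cleaned = "".join(c for c in text if c in allowed)
    return cleaned.upper()

print(shout("hello, spark!"))  # -> HELLO SPARK
```

Because the import is deferred to call time, it succeeds anywhere the module is actually importable, which is exactly the property the fix below relies on.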
Fixing the Import Error in PySpark UDFs
Import Inside the UDF
The solution that worked for me was to move the import statement inside the function being used as a UDF. This ensures that the necessary module is available when the function is executed on worker nodes.
Before (Fails with ModuleNotFoundError):
```python
# This import works on the driver, but fails inside a UDF
from utils.formatting import format_text

def format_text_local(text):
    return format_text(text)  # breaks when run on a Spark worker
```

After (Works with Spark UDFs):

```python
def format_text(text):
    # Run the import at call time, on the worker, after making sure
    # the script's directory is on the worker's import path.
    import os
    import sys
    sys.path.append(os.path.abspath(os.path.dirname(__file__)))
    from utils.formatting import format_text as format_text2
    return format_text2(text)
```

By placing the import inside the function, the import executes on the worker nodes where the function actually runs.
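The mechanics can be exercised without a Spark cluster. The sketch below writes a throwaway utils package to a temporary directory (a hypothetical stand-in for the real local module, for illustration only) and shows the deferred import resolving at call time; in an actual job, the appended path would be the script's own directory, as in the snippet above.

```python
import os
import sys
import tempfile

# Build a throwaway "utils" package as a stand-in for the real
# local module (hypothetical contents, for illustration only).
pkg_dir = tempfile.mkdtemp()
os.makedirs(os.path.join(pkg_dir, "utils"))
with open(os.path.join(pkg_dir, "utils", "__init__.py"), "w") as f:
    f.write("")
with open(os.path.join(pkg_dir, "utils", "formatting.py"), "w") as f:
    f.write("def format_text(text):\n    return text.strip().title()\n")

def format_text(text):
    # Deferred import: the path tweak and the import both happen at
    # call time, i.e. on whichever node actually runs the function.
    sys.path.append(pkg_dir)
    from utils.formatting import format_text as format_text_impl
    return format_text_impl(text)

print(format_text("  hello spark  "))  # -> Hello Spark
```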
Conclusion
PySpark’s distributed nature makes dependency management tricky, but in my case, simply importing the module inside the UDF was enough to resolve the issue. If you’re running into a ModuleNotFoundError when using a PySpark UDF, this approach is worth trying first before considering more complex solutions.
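For completeness, the heavier-weight fixes ship the module to the workers up front instead of importing it lazily. A sketch, assuming the local package has been zipped as utils.zip (a hypothetical filename):

```shell
# Distribute the zipped package to every executor at submit time
spark-submit --py-files utils.zip my_job.py

# ...or from inside a running job, via the SparkContext:
# spark.sparkContext.addPyFile("utils.zip")
```

Either approach makes the top-of-file import work unchanged on the workers, at the cost of an extra packaging step on every deploy.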


