Abhishek Bhatia
11/14/2024, 5:13 PM
[11/14/24 17:09:07] WARNING  /root/.venv/lib/python3.9/site-packages/kedro/framework/startup.py:99: KedroDeprecationWarning: project_version in pyproject.toml is deprecated, use kedro_init_version instead
  warnings.warn(
[11/14/24 17:09:15] INFO     Kedro project project
[11/14/24 17:09:17] WARNING  /root/.venv/lib/python3.9/site-packages/kedro/framework/session/session.py:267: KedroDeprecationWarning: Jinja2TemplatedConfigLoader will be deprecated in Kedro 0.19. Please use the OmegaConfigLoader instead. To consult the documentation for OmegaConfigLoader, see here: https://docs.kedro.org/en/stable/configuration/advanced_configuration.html#omegaconfigloader
  warnings.warn(
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
24/11/14 17:09:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[11/14/24 17:12:53] WARNING  /root/.venv/lib/python3.9/site-packages/pyspark/pandas/__init__.py:49: UserWarning: 'PYARROW_IGNORE_TIMEZONE' environment variable was not set. It is required to set this environment variable to '1' in both driver and executor sides if you use pyarrow>=2.0.0. pandas-on-Spark will set it for you but it does not work if there is a Spark context already launched.
  warnings.warn(
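A side note on the two KedroDeprecationWarnings above, which are cosmetic and unrelated to the hang: rename project_version to kedro_init_version in the [tool.kedro] section of pyproject.toml, and switch settings.py to OmegaConfigLoader. Below is a minimal sketch of the settings.py change, assuming a standard Kedro 0.18 project; the config_patterns entry is illustrative.

# settings.py
from kedro.config import OmegaConfigLoader

# Replaces the deprecated Jinja2TemplatedConfigLoader; Jinja2 templating
# gives way to OmegaConf-style ${...} interpolation.
CONFIG_LOADER_CLASS = OmegaConfigLoader
CONFIG_LOADER_ARGS = {
    # Illustrative: expose conf/base/spark.yml as the "spark" config key.
    "config_patterns": {
        "spark": ["spark*", "spark*/**"],
    }
}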
• kedro: 0.18.14
• python: 3.9
• Running inside a Docker container (since the requirements don't compile on M* Macs)
I understand this is too little information to go on, but I have the same problem. Is there anywhere I could look to see where it is stuck?
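Regarding the PYARROW_IGNORE_TIMEZONE UserWarning in the log: the variable must be set before any Spark context exists, otherwise pandas-on-Spark cannot set it for you. A minimal sketch of doing this wherever the session is created (the app name is illustrative):

import os

# Set before the JVM/SparkContext starts, on the driver side.
os.environ["PYARROW_IGNORE_TIMEZONE"] = "1"

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("kedro-pipeline")  # illustrative
    # Propagate the same variable to the executor side.
    .config("spark.executorEnv.PYARROW_IGNORE_TIMEZONE", "1")
    .getOrCreate()
)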
Juan Luis
11/14/2024, 5:17 PM
after the [11/14/24 17:12:53] log, nothing happens?
Abhishek Bhatia
11/14/2024, 5:26 PM
Abhishek Bhatia
11/14/2024, 5:26 PM
Juan Luis
11/14/2024, 5:27 PM
WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
?
Abhishek Bhatia
11/14/2024, 5:30 PM
Abhishek Bhatia
11/15/2024, 1:56 AM
Abhishek Bhatia
11/15/2024, 6:44 AM
I set spark.driver.cores to 4, and then it ran smoothly.
Might be related to how the PySpark installation sets driver cores differently in different environments, hence the difference. (not sure)
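For anyone hitting the same hang: a minimal sketch of pinning the driver cores explicitly when the session is built (the app name and memory setting are illustrative; note that spark.driver.cores applies in cluster mode, while in local mode parallelism comes from the master URL):

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("kedro-pipeline")            # illustrative
    .master("local[4]")                   # local mode: 4 cores via the master URL
    .config("spark.driver.cores", "4")    # cluster mode: explicit driver cores
    .config("spark.driver.memory", "4g")  # illustrative; tune for the container
    .getOrCreate()
)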
Juan Luis
11/15/2024, 8:28 AM