J. Camilo V. Tieck
07/11/2023, 10:49 PM
Deepyaman Datta
07/11/2023, 10:57 PM
> we have tried a lambda function that has the code for [3.0] and has access to the S3 location [4]. this works, but the code for the post-processing [3.0] is duplicated.
Why is the code duplicated? I don't think I understood from your writeup. Is the problem that you don't package the code for [3.0] in a way that it can be run from both the service doing scheduled execution and the one for on-demand execution?
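[Editor's note: to make the packaging idea concrete, here is a minimal sketch of the kind of Lambda handler being discussed, assuming a hypothetical shared package `post_processing` that holds the [3.0] logic and is installed in both the scheduled service and the Lambda. The package name, function name, and event shape are illustrative, not from the thread.]

```python
import boto3

# Hypothetical shared package holding the [3.0] post-processing logic;
# built once and installed in both deployments, so the code is no
# longer duplicated between them.
from post_processing import run_post_processing

s3 = boto3.client("s3")


def lambda_handler(event, context):
    # Illustrative event shape: the on-demand trigger passes the S3
    # location [4] of the object to process.
    bucket = event["bucket"]
    key = event["key"]

    raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    result = run_post_processing(raw)

    # Write the processed output back under a separate prefix.
    s3.put_object(Bucket=bucket, Key=f"processed/{key}", Body=result)
    return {"status": "ok", "output_key": f"processed/{key}"}
```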
J. Camilo V. Tieck
07/11/2023, 11:34 PM
J. Camilo V. Tieck
07/11/2023, 11:35 PM
Deepyaman Datta
07/11/2023, 11:41 PM
> • I tried micro-packaging the pipeline and importing it in the lambda, but the dependency was too large and Jenkins had some issues with it.
This is in line with the approach I'd take. You could push the generated wheel to a shared space, like S3 or even an internal PyPI repository. That wheel would be used by both deployments. You could alternatively have a CI process that builds the Docker image for EC2 + the Docker image for Lambda (I don't have enough recent hands-on experience with AWS to say whether you could use the same image), and use those in your deployment. There are also "hackier" ways of doing things, like dynamically referencing the pipeline code from a shared location, but I'm not sure why you should do that instead of the above two options.
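[Editor's note: a minimal sketch of the publish step described above, assuming a hypothetical bucket `my-shared-artifacts` and that `kedro micropkg package` has already written its output into `dist/`; depending on the Kedro version the artifact is a wheel or an sdist tarball. All names here are illustrative.]

```python
from pathlib import Path

import boto3

# Hypothetical shared location; both the EC2 and Lambda builds would
# download and `pip install` the artifact from here.
BUCKET = "my-shared-artifacts"
PREFIX = "wheels"

s3 = boto3.client("s3")

# `kedro micropkg package <pipeline>` writes into dist/; depending on
# the Kedro version the distribution is a .whl or a .tar.gz.
for artifact in Path("dist").glob("*"):
    if artifact.name.endswith((".whl", ".tar.gz")):
        s3.upload_file(str(artifact), BUCKET, f"{PREFIX}/{artifact.name}")
        print(f"uploaded s3://{BUCKET}/{PREFIX}/{artifact.name}")
```

A CI job (Jenkins, in this thread) could run this after packaging; both Docker builds then install the same artifact, so the post-processing code lives in exactly one place.]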
[The thread continues with further replies from J. Camilo V. Tieck, Deepyaman Datta, and Juan Luis through 07/12/2023, 9:31 PM; the message text was not captured in this extract.]