
Would be interesting to add a cold start + "import boto3" benchmark for Python as importing boto3 takes forever on lambdas with little memory. For this scenario I only know this benchmark but it is from 2021 https://github.com/MauriceBrg/aws-blog.de-projects/tree/mast...
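For reference, a minimal sketch of how one could measure this from inside a handler (hypothetical; memory size and results will vary):

  import time

  def handler(event, context):
      # Time the boto3 import. On a cold start this includes reading
      # and compiling the package; on warm starts the module is
      # already cached in sys.modules, so the import is ~free.
      start = time.perf_counter()
      import boto3  # noqa: F401
      elapsed_ms = (time.perf_counter() - start) * 1000
      return {"import_boto3_ms": round(elapsed_ms, 2)}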

I don't really use Python, but most AWS SDKs seem to be autogenerated for each language, and they're pretty much just thin wrappers over REST calls to internal AWS endpoints.

I dunno why a Python impl would be particularly heavy.


Sprawling imports of plain-text source spread across hundreds of files, without lazy loading.

If imports are slow, one should probably look into pre-compiling .pyc files into the Lambda bundle.
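For example, one possible build-time step (the build/ path is a placeholder for the unpacked bundle directory; it has to run under the same Python minor version as the Lambda runtime):

  import compileall

  # Pre-compile every module in the bundle so the generated
  # __pycache__/*.pyc files ship with the deployment package and the
  # interpreter can skip the compile step on cold start.
  compileall.compile_dir("build/", quiet=1)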

This is a well-known issue, and the fix is not to create any boto3 clients at runtime. Instead, ensure they're created globally (even if you throw them away), as the work then gets done once during the init period. The init period gets additional CPU allocation, so this is essentially "free" CPU.

Source: I'm a former AWS employee.
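A minimal sketch of the pattern (the S3 call is just an example):

  import boto3

  # Constructed once at import time, i.e. during the init phase with
  # its extra CPU allocation; reused across all warm invocations.
  s3 = boto3.client("s3")

  def handler(event, context):
      # No client construction here, only the actual work.
      return [b["Name"] for b in s3.list_buckets()["Buckets"]]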


Thanks for citing your sources! I think your source may be out of date, though. The "free init time hack" was killed in August (unless I'm missing something; I never used it myself).

https://aws.amazon.com/blogs/compute/aws-lambda-standardizes...


Good callout that it's no longer free. However, you still get extra CPU, and assuming your execution environment isn't reloaded, that init time is amortized across all of the environment's invocations.

SnapStart is now more widely available, which is the other option for shrinking the billed time spent in init (when I left, only Java SnapStart was available).


We have already perfectly sequestered CO2 in the ground and still dig it up and burn it. Insane.


  $ dig ns joindns4.eu +short
  ns63.cloudns.net.
  ns64.cloudns.uk.
  ns61.cloudns.net.

And the US and UK have control over the TLDs of the nameservers.


  Location: southern Germany
  Remote: Yes
  Willing to relocate: Yes
  Technologies: Python (Flask, backend), C# .NET (some WPF), AWS Solutions Architect (Associate; actual experience mostly in serverless: Lambda, DynamoDB, AppSync, Glue, S3, Athena, CDK, IaC), InfluxDB, Redis, OpenWRT
  Résumé/CV: 6 years of experience working as a Software Engineer in an industrial IoT (IIoT) context. The focus was always on getting the data from the machinery, processing it, and making it usable for the user. Processing data on the edge, providing and using a plethora of interfaces, processing data in the cloud, data lakes: I have seen it all :-) Also worked as a team lead/product owner for a while and had good success by reducing complexity. I have high standards regarding code quality. Contact me for my full CV and GitHub.
  Email: 7ld9eecnj [at] mozmail.com


S3 Select is no longer available to new customers. Athena with a columnar file format (e.g. Parquet) in S3, plus partitioning via the Glue Data Catalog, is the solution for OP's problem. The cost of these queries is very low because you only pay for the data actually scanned, and with a columnar format Athena reads only the necessary columns. The data in the columns is usually compressed as well, so the amount scanned is even smaller.
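A rough sketch with boto3 (the table, database, and bucket names are made up):

  import boto3

  athena = boto3.client("athena")

  # With Parquet plus partition pruning (here on `day`), Athena scans
  # only the referenced columns in the matching partitions, so the
  # per-query cost stays tiny.
  response = athena.start_query_execution(
      QueryString=(
          "SELECT device_id, avg(temperature) "
          "FROM sensor_data "
          "WHERE day = '2024-01-15' "
          "GROUP BY device_id"
      ),
      QueryExecutionContext={"Database": "iot_lake"},
      ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
  )
  print(response["QueryExecutionId"])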


  Location: southern Germany
  Remote: Yes
  Willing to relocate: Yes
  Technologies: Python, C# .NET (some WPF), AWS Solutions Architect (Associate; actual experience mostly in serverless: Lambda, DynamoDB, AppSync, Glue, Athena, CDK/IaC), InfluxDB, Redis, OpenWRT, Flask
  Résumé/CV: 6 years of experience working as a Software Engineer in an industrial IoT (IIoT) context. The focus was always on getting the data from the machinery, processing it, and making it usable for the user. Processing data on the edge, providing and using a plethora of interfaces, processing data in the cloud, data lakes: I have seen it all :-) Also worked as a team lead/product owner for a while and had good success by removing complexity. I have high standards regarding code quality.
  Email: 7ld9eecnj [at] mozmail.com


I had the same question and asked Claude Sonnet; see its answer below.

In the context of a workflow engine like Netflix Conductor, a workflow refers to a structured sequence of tasks or activities that need to be executed to complete a specific business process or achieve a particular goal. [...]

To give you a concrete example, imagine an e-commerce platform's order processing workflow:

1. Validate order
2. Check inventory
3. Process payment
4. If payment successful:
   a. Reserve inventory
   b. Initiate shipping
   c. Send confirmation email
5. If payment fails:
   a. Cancel order
   b. Notify customer

In this workflow, each step could be a separate microservice or function. The workflow engine would orchestrate the execution of these steps, handling the flow of data between them, managing any errors or retries, and ensuring the entire process completes successfully.

Would you like me to elaborate on any specific aspect of workflows in this context? Or perhaps you're curious about how they're implemented or managed in practice?
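To make the orchestration idea concrete, here is a toy Python loop of my own (not Conductor's actual API; the step functions named at the bottom are hypothetical):

  class TransientError(Exception):
      """Raised by a step for failures that are worth retrying."""

  def run_order_workflow(order, steps, max_retries=2):
      # Each step is a plain function that takes and returns the order
      # state; the engine runs them in sequence and retries transient
      # failures before giving up.
      for step in steps:
          for attempt in range(max_retries + 1):
              try:
                  order = step(order)
                  break
              except TransientError:
                  if attempt == max_retries:
                      raise
      return order

  # Usage sketch with hypothetical step functions:
  # run_order_workflow(order, [validate_order, check_inventory,
  #                            process_payment, reserve_inventory])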


Printing pure copper (99.9%) has been possible for years with the SLM process, using lasers with an appropriate wavelength.


Any references for 99.9% density with SLM copper? My understanding is that pure-copper SLM printing is less frequently done, as it doesn't work well with the infrared lasers on most machines, requires high heat and speed, and produces more porosity than other alloys. It's also hard to print so that it's strong, conductive, and heat stable.

I think there’s still quite active research in the area, though, and no doubt there’s a lot going on that I don’t know! https://www.sciencedirect.com/science/article/pii/S026412752...

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10861549/


Sorry, I wasn't talking about density but about the copper content of a powder that is printable. Googling a bit, I found this presentation from 2022 showing that a density of 99.5% is possible for pure copper, although at half the productivity of a copper alloy: https://www.coppercouncil.org/wp-content/uploads/2022/02/TS2...


> doesn’t work well with the infrared lasers

The copper use case is what kicked off an industry-wide race toward offering blue lasers as an option. There is more than just wavelength that goes into printing good copper results, but that is a major factor.


The 'full load' designation may be a distraction, as the research appears to be focused on improving cooling in general, which obviously enables the use of less efficient and cheaper electronics.


I remember some posts on HN describing ASML's software quality as atrocious.

