
joba2ca

joba2ca@feddit.de
2 posts • 28 comments

Thanks! I did not know that. Very neat :)


Well hey, thanks!


What exactly happened? The reasoning in the graphic does not tell me much. I only saw the summary, which listed him as second. He seemed to have finished his lap before the red flag, no?


Even with a size reduction of the cars, I cannot imagine how a significant amount of weight could be saved. The biggest weight contributors besides the chassis are the engine and the battery, right?


Not available in all countries, though :(


I had the pleasure of conducting research into self-supervised learning (SSL) for computer vision.

What stood out to me was the simplicity of the SSL algorithms, combined with the astonishing performance that self-supervised pretrained models achieve after supervised fine-tuning.

Also the fact that SSL works across tasks and domains, e.g., text generation, image generation, semantic segmentation…
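To give a sense of that simplicity: here is a minimal NumPy sketch of one common SSL objective, a SimCLR-style contrastive (NT-Xent) loss. This is my own illustration, not a specific implementation from the research mentioned above; the function name, shapes, and temperature are assumptions.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent contrastive loss (illustrative sketch).

    z1, z2: (N, D) embeddings of two augmented views of the same N images.
    Each embedding's positive is its other view; all remaining 2N - 2
    embeddings in the batch act as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)              # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize rows
    sim = z @ z.T / temperature                       # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                    # mask self-similarity

    n = z1.shape[0]
    # positive of row i is its other view: i + n for the first half, i - n after
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])

    # cross-entropy of picking the positive among all other embeddings
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

The entire objective is a dozen lines: pull two views of the same image together, push everything else apart. The heavy lifting in SSL comes from data augmentation and scale, not from algorithmic complexity.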


Do you happen to have a good source to read up on continual RL that you can recommend? I am not familiar with this use case for RL.


Depends on the use case, I guess. If any larger-scale deep learning is going on, you cannot afford to buy all the required GPUs anyway.

However, I found myself using my tower PC quite a lot during my Master's. Especially for uni projects, my GPU came in very handy and was much appreciated by group members. Having your own GPU was often more convenient than using the resources provided by the lab.

Also, while I relied mostly on cloud resources in my last job, having a GPU available on my work machine would have been very convenient at times. Very nice for EDA and for playing with models during the early phase of a project.

Apart from that, IMO a good CPU and >32 GB of RAM on your own machine are sufficient for EDA and related tasks, while I would rely on cloud resources for everything else, e.g., model training and large-scale analyses.
