Summary
Most European countries moved clocks forward one hour on Sunday, marking the start of daylight saving time (DST), a practice increasingly criticized.
Originally introduced during World War I to conserve energy, DST returned during the 1970s oil crisis and now shifts Central European Time to Central European Summer Time.
Despite a 2018 EU consultation where 84% of nearly 4 million respondents supported abolishing DST, implementation stalled due to member state disagreement.
Poland, currently holding the EU presidency, plans informal consultations to revisit the issue amid broader geopolitical priorities.
In my own experience, ideally you avoid such interfaces altogether. If however you're forced to handle them (which is far too common), the cheapest and safest design is to convert to UTC using a suitable default timezone at the interface level and store the result in your core system time field, AND also store the local time, but not in a field that you actually use for queries and computations in the core. If (more likely, when) some of those times converted with a "suitable" default turn out to have been wrong in some way - which is not necessarily due to the timezone conversion - you can manually fix just those (ideally with a bulk data update).
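As a rough illustration of that split between a canonical UTC field and an audit-only copy of the original local time, here is a minimal Python sketch. The record shape, field names and the "Europe/Warsaw" default are my own assumptions, not anything prescribed above; the point is only that the raw input and the assumed timezone are kept around untouched so a bad default can be corrected later.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical default timezone assumed at the interface boundary.
DEFAULT_TZ = ZoneInfo("Europe/Warsaw")

@dataclass
class EventRecord:
    # Canonical field: always UTC, the only one used for queries and computations.
    occurred_at_utc: datetime
    # Audit-only fields: never queried in the core, kept so bad
    # conversions can be corrected later in bulk.
    occurred_at_raw: str       # the local time exactly as received
    assumed_timezone: str      # the default we applied at ingest

def ingest_local_time(raw_local: str, tz: ZoneInfo = DEFAULT_TZ) -> EventRecord:
    """Convert a naive local timestamp from an external interface to UTC."""
    naive = datetime.fromisoformat(raw_local)   # e.g. "2025-06-15 14:30:00"
    localized = naive.replace(tzinfo=tz)        # attach the assumed timezone
    return EventRecord(
        occurred_at_utc=localized.astimezone(timezone.utc),
        occurred_at_raw=raw_local,
        assumed_timezone=str(tz),
    )
```

Keeping the raw value out of the queryable schema is deliberate: nothing in the core ever depends on it, so rewriting it (or reinterpreting it) later cannot break existing computations.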
Mind you, a lot of this shit needs to be solved at the systems design and requirements specification level: either it's accepted that the system will have a fraction of its time data wrong (and it always will anyway, even without timezones: users enter wrong dates, OCR can't correct for users filling in the wrong time in a time field on a document, timestamps generated by machines whose internal clocks are not regularly synced with NTP servers can be off by many minutes, and so forth), or the whole thing is designed as I described above, so that all data is treated as compatible and, when it inevitably turns out that some times in some fields were wrong or incorrectly translated, you can fix them in a non-automated way.
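The non-automated fix then becomes a mundane bulk update rather than a schema migration. Building on the hypothetical EventRecord sketch above (again, names and the example timezone are mine), it might look something like this:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def bulk_fix_timezone(records: list[EventRecord], correct_tz: ZoneInfo) -> None:
    """Re-derive the canonical UTC field for records whose assumed timezone
    turned out to be wrong, using the untouched raw local time kept at ingest."""
    for rec in records:
        naive = datetime.fromisoformat(rec.occurred_at_raw)
        rec.occurred_at_utc = naive.replace(tzinfo=correct_tz).astimezone(timezone.utc)
        rec.assumed_timezone = str(correct_tz)

# e.g. a batch that actually came from a Lisbon office, not the default:
# bulk_fix_timezone(mislabelled_batch, ZoneInfo("Europe/Lisbon"))
```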
As much as the dream is to have the computer do everything itself in code and the data be perfect, that's incompatible with the real world, and that goes for far more things than just time values.
The point is, again, that programmers have to deal with the world as it is (and dates are hardly the only "quirk" around), not the world as they would like it to be. That needs to be dealt with at the level of system design, by the (supposedly) senior designers and technical architects, rather than having programmers running around fixing the inevitable problems in a system whose design does not take into account the quirks of how certain kinds of data are produced and consumed. Proper systems design is about minimising the direct and indirect consequences of data errors, inconsistencies and datatype-specific quirks, not about trying to fulfill the expectation that all data in one's system is perfect.