Hey folks! I think this request is right up this comm’s alley. I’m sure that we all know bogo sort but, what other terrible/terribly inefficient algorithms, software architecture, or design choices have you been horrified/amused by?
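For anyone who hasn't seen it: bogosort just shuffles the list repeatedly until it happens to come out sorted. A minimal sketch (expected O(n·n!) runtime, so keep inputs tiny):

```python
import random

def bogosort(items):
    """Shuffle until sorted. Expected O(n * n!) -- do not feed it real data."""
    items = list(items)
    while any(a > b for a, b in zip(items, items[1:])):
        random.shuffle(items)
    return items
```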
I, sadly, lost a great page of competing terrible sorting algorithms, but I’ll lead with JDSL as a terrible (and terribly inefficient) software architecture and design. The TL;DR is that a fresh CS grad got an internship at a company that based its software offering around a custom DSL built on JSON, which stored every function as a separate commit in an svn repo. The poor intern had a bad time: attempting to add comments to the code resulted in customer data loss.
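(That's The Daily WTF's "Inner JSON Effect" story. As a purely hypothetical illustration — these field names are invented, not taken from the actual story — the "language" was roughly JSON objects pointing at svn revision numbers where the real code lived:)

```json
{
  "type": "function",
  "name": "createCustomer",
  "svn_revision": 4532
}
```

Since comments aren't part of the expected object shape, adding them confused the runtime — hence the data loss.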
No, I won’t give you my github.
That’s ok. I’ve only just met you and am not sure that I’m ready to
commit
.
That’s incredible.
Always worth a read: How to write unmaintainable code
great. time to put that in practice.
slow inverse square root:
#include <math.h>

float slowinvsqrt(float x) {
    const long accuracy = 100000000; // larger number = better accuracy
    if (x <= 0.0f) { return NAN; }
    if (x == 1.0f) { return 1.0f; }
    if (x < 1.0f) { return 1.0f / slowinvsqrt(1.0f / x); }
    int max_power = log(accuracy) / log(x);
    long pow1 = pow(x, max_power - 1);
    long pow2 = pow(x, max_power);
    double current = 1.0;
    double previous = 1.0;
    for (long i = 0; i < 10 * accuracy; i++) {
        current = sin(current);
        if (i == pow1) { previous = current; }
        if (i == pow2) { return current / previous; }
    }
    return NAN; // loop ended before reaching pow2
}
“EvErYtHiNg ShOuLd Be A mIcRo SeRvIcE” --Executive Who Doesn’t Have to Maintain Said Microservices
I unironically had a screening interview with a recruiter that asked “If you were creating a startup, would you use microservices?”. She didn’t like that my answer was “It depends, I don’t have enough information to answer”.
“If you were making food, would you use onion powder?”
There are subfields of computer science dedicated to this question. A good starting point for the theory would be Pessimal algorithms and simplexity analysis, which lays out two concepts:
- The time & space simplexity of an algorithm indicates best-case lower bounds on resource usage, and
- An algorithm is pessimal if no equivalent algorithm wastes more time/space/etc.
For example, common folklore is that sorting has O(n lg n) time complexity, depending on assumptions. In the paper, they give that sorting has Ω(n ** (lg n / 2)) time simplexity; any algorithm which takes more time, like bogosort, must do so through some sort of trickery like non-determinism or wasting time via do-nothing operations.

πfs: The Data-Free Filesystem!
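The flagship example in that paper is slowsort, built on the "multiply and surrender" strategy. A Python sketch of the algorithm as described there:

```python
def slowsort(a, i=0, j=None):
    """Sort a[i..j] in place: recurse on both halves to find each half's
    maximum, swap the overall max to the end, then sort the rest again."""
    if j is None:
        j = len(a) - 1
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(a, i, m)            # max of first half ends up at a[m]
    slowsort(a, m + 1, j)        # max of second half ends up at a[j]
    if a[m] > a[j]:
        a[m], a[j] = a[j], a[m]  # now a[j] holds the max of a[i..j]
    slowsort(a, i, j - 1)        # "surrender": re-sort everything but the max
```

Keep the input lists tiny; the recursion is the whole point.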
sleep sort
Technically, sleep sort is O(n), so faster than the theoretical optimum of O(n log n) … not so bad ;)
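For the curious, a sketch of sleep sort: spawn a thread per element that sleeps in proportion to its value, then appends. (The O(n) claim conveniently offloads the real work onto the OS scheduler.)

```python
import threading
import time

def sleep_sort(nums):
    """One thread per number; smaller numbers wake up (and append) first."""
    result = []
    lock = threading.Lock()

    def worker(n):
        time.sleep(n * 0.05)  # scale factor trades runtime for reliability
        with lock:
            result.append(n)

    threads = [threading.Thread(target=worker, args=(n,)) for n in nums]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result
```

Only "works" for non-negative numbers, and close values can still race — which is very much in the spirit of this thread.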
Sounds like you should look at a few years of https://thedailywtf.com entries. Enough to make the staunchest man (or woman) weep.
https://thedailywtf.com/articles/gotta-catch-em-all
Dear God.
try {
    /* ... some important code ... */
} catch (OutOfMemoryException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (OverflowException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (InvalidCastException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (NullReferenceException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (IndexOutOfRangeException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (ArgumentException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (InvalidOperationException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (XmlException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (IOException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (NotSupportedException exception) {
    Global.Insert("App.GetSettings;", exception.Message);
} catch (Exception exception) {
    Global.Insert("App.GetSettings;", exception.Message);
}