Why The 1970 Bug Bricks Your iPhone

Thanks to scum-of-the-internet 4chan, we've all been hearing lately about a particular iOS bug that will brick modern iPhones if you set the date back to 1 January 1970. Why does that happen? YouTuber Tom Scott explains. Scott gives a quick lesson in Unix time to explain why he thinks the bricking happens: your iPhone stores time as a single integer, representing the number of seconds elapsed since 1970. If you set the clock back to 1 January 1970, that value becomes 0. That isn't a problem in and of itself, but if the phone then tries to display a time before that, say a text you received a few hours ago, the calculation produces a negative number, which causes a crash.

This is only a theory — Apple hasn't confirmed what causes the bug, and probably never will. Even so, Scott's video is worth watching, both as a lesson in how computers interpret time and as a cautionary tale for programmers everywhere.

If you've bricked your phone, the good news is it isn't gone forever: you need to either let the battery run entirely down (slow), pry the phone open and disconnect the battery (scary), or perform a Device Firmware Update (hard). Or take it to an Apple Store, and ask them nicely not to laugh at you.



    After Y2K you'd think bad use of integers for timekeeping would be avoided in design... especially after the billions wasted preventing Y2K errors in legacy systems

      As one of those 'money wasters' I can assure you that the reason nothing went wrong was because that money was spent. It was a very real problem and the idiots who argue that the money was wasted because nothing went wrong haven't a clue. I saw and fixed mountains of code that would have crashed and burned after 1/1/2000.

      Would you decry the 'waste' of money spent on defence when no wars occur, or the money spent on firefighters when there are no fires...

        I'm pretty sure what Dirtyshadow meant by it being a 'waste' was not that it didn't need to be spent... but that it was a huge waste of money relative to the industry having the forethought to predict that legacy systems might still actually be in use at the turn of the century.

        It needed to be spent, but it was a huge waste of money to pay people to hunt through code they may not have even been familiar with, compared to the expenditure involved if the original designers had factored in a new millennium's impact on date fields.

    The UNIX standard is to use a signed integer for time_t, so there should be no risk of underflow at the 1970 epoch. Either Apple (foolishly) changed the value in the OS to be unsigned, or more likely some subsystem is storing the result of a time calculation incorrectly in an unsigned integer variable during startup and then shitting itself when trying to do a comparison.
