Hi!

My previous/alt account is yetAnotherUser@feddit.de, which will be abandoned soon.

  • 0 Posts
  • 3 Comments
Joined 29 days ago
Cake day: June 1st, 2024

  • Yes, but similar flaws exist for your proof.

    The algebraic proof that 0.999… = 1 must first justify why you are allowed to assign 0.999… to x at all.

    My “proof” abuses algebraic notation in exactly that way: you cannot assign infinity to a variable, and once you do, the regular algebraic rules become meaningless.

    The proper proof uses the definition of 0.999… as the limit of the sequence 0.9, 0.99, 0.999, …: for any epsilon > 0, all but finitely many terms of that sequence lie within the epsilon-neighborhood of 1 (the interval 1 ± epsilon), so the limit, and with it 0.999…, is exactly 1.
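
    A worked version of that limit argument, sketched in LaTeX notation (the epsilon-N bookkeeping is my own phrasing of the standard proof, not part of the original comment):

    0.999\ldots := \lim_{n\to\infty} \sum_{k=1}^{n} 9 \cdot 10^{-k} = \lim_{n\to\infty} \left(1 - 10^{-n}\right)

    \text{Given } \varepsilon > 0, \text{ choose } N \text{ with } 10^{-N} < \varepsilon.
    \text{Then } \left| 1 - \left(1 - 10^{-n}\right) \right| = 10^{-n} \le 10^{-N} < \varepsilon \text{ for all } n \ge N,

    \text{so every tail of the sequence } 0.9,\ 0.99,\ 0.999,\ \ldots \text{ lies in the interval } 1 \pm \varepsilon \text{, and its limit is exactly } 1.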


  • yetAnotherUser@discuss.tchncs.de to Science Memes@mander.xyz · “I just cited myself.”

    Unfortunately not an ideal proof.

    It makes certain assumptions:

    1. That a number 0.999… exists and is well-defined
    2. That multiplication and subtraction for this number work as expected

    Similarly, I could prove that the number consisting of infinitely many 9s to the left of the decimal separator is equal to -1:

    ...999.0 = x
    ...990.0 = 10x
    
    Calculate x - 10x:
    
    x - 10x = ...999.0 - ...990.0
    -9x = 9
    x = -1
    

    And while this is true for 10-adic numbers, it is certainly not true for the real numbers.
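
    For completeness, a sketch of why the 10-adic claim holds, in the same notation (this justification is my addition, not the original comment's wording): in the 10-adic metric, powers of 10 shrink toward zero, so the geometric series below converges.

    |10^{k}|_{10} = 10^{-k} \to 0, \quad\text{hence}\quad \ldots999 = \sum_{k=0}^{\infty} 9 \cdot 10^{k} = \frac{9}{1 - 10} = -1 \quad \text{(10-adically)}

    \text{Equivalently, adding 1 triggers an infinite carry: } \ldots999 + 1 = \ldots000 = 0, \text{ so } \ldots999 = -1.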