msanders / cider

Hassle-free bootstrapping with Homebrew.
https://pypi.python.org/pypi/cider/
MIT License

set-default is converting bool into int #28

Closed RogerThiede closed 9 years ago

RogerThiede commented 9 years ago

When using cider set-default -bool, the given domain key is stored correctly in defaults.yaml, but it is not applied correctly to the system: boolean true and false are converted to int.

» cider set-default -bool com.apple.desktopservices DSDontWriteNetworkStores TRUE
» defaults read com.apple.desktopservices                                        
{
    DSDontWriteNetworkStores = 1;
}
» cider set-default -bool com.apple.desktopservices DSDontWriteNetworkStores true
» defaults read com.apple.desktopservices
{
    DSDontWriteNetworkStores = 1;
}

» cider set-default com.apple.desktopservices DSDontWriteNetworkStores TRUE
» defaults read com.apple.desktopservices                                  
{
    DSDontWriteNetworkStores = 1;
}
» cider set-default com.apple.desktopservices DSDontWriteNetworkStores true
» defaults read com.apple.desktopservices                                  
{
    DSDontWriteNetworkStores = 1;
}

Expected behavior is:

» defaults write com.apple.desktopservices DSDontWriteNetworkStores true
» defaults read com.apple.desktopservices                               
{
    DSDontWriteNetworkStores = true;
}
RogerThiede commented 9 years ago

Actually, this might be my misunderstanding of OS X's defaults write, but I'd like another opinion on it.

msanders commented 9 years ago

I think this may be a quirk with the defaults command. You can see the command cider uses if you pass in the --debug flag:

$ cider --debug set-default -bool com.apple.desktopservices DSDontWriteNetworkStores TRUE
==> defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool True
$ defaults read com.apple.desktopservices
{
    DSDontWriteNetworkStores = 1;
}

In your last example, it looks like you're setting the value to the string "true", which may or may not do what you want (Objective-C apps often implicitly convert these values to true).
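
For what it's worth, you can check what type actually got stored with defaults read-type (a quick sketch; the exact output wording may vary by OS X version):

$ defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool TRUE
$ defaults read-type com.apple.desktopservices DSDontWriteNetworkStores
Type is boolean
$ defaults write com.apple.desktopservices DSDontWriteNetworkStores true
$ defaults read-type com.apple.desktopservices DSDontWriteNetworkStores
Type is string

So the -bool form does store a real boolean; defaults read just renders it as 1.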

RogerThiede commented 9 years ago

That makes sense. The OS X defaults tool stores bools properly but displays them as 0 or 1 when read back. My expected-behavior example was a bad one because I was actually setting a string (which also ends up having the desired effect).
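
For the record, exporting the domain confirms the stored type (output trimmed; this assumes defaults export writes the XML plist to stdout when given - as the path):

» defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool true
» defaults export com.apple.desktopservices -
<?xml version="1.0" encoding="UTF-8"?>
...
<dict>
    <key>DSDontWriteNetworkStores</key>
    <true/>
</dict>
</plist>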

Not a bug.

Thanks.