Issue #18899 has been updated by Eregon (Benoit Daloze).
I've taken a look at `IO#set_encoding` recently, and it's such an unreadable mess
that I think nobody would be able to explain its full semantics.
So anything that simplifies it would IMHO be welcome.
I think `IO#set_encoding` should simply set the internal/external encodings for that IO,
with no special cases and without consulting the default external/internal encodings.
If some combinations don't make sense, they should raise an exception.
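For illustration, here is a minimal sketch of what those simplified semantics could
look like (hypothetical Ruby, not CRuby's implementation; the error case shown is just
one example of a combination that could raise):
```ruby
# Hypothetical sketch of the simplified semantics proposed above: resolve both
# arguments the same way, store them verbatim, and never fall back to the
# default external/internal encodings.
def set_encoding(external, internal = nil)
  # Encoding.find accepts either an encoding name or an Encoding object.
  external = Encoding.find(external) unless external.nil?
  internal = Encoding.find(internal) unless internal.nil?

  # Example of a nonsensical combination that would raise instead of being
  # silently rewritten (which cases qualify is an open design question).
  if internal && internal == external
    raise ArgumentError, "internal and external encodings are the same"
  end

  @external_encoding = external
  @internal_encoding = internal
  self
end
```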
----------------------------------------
Bug #18899: Inconsistent argument handling in IO#set_encoding
https://bugs.ruby-lang.org/issues/18899#change-100274
* Author: javanthropus (Jeremy Bopp)
* Status: Open
* Priority: Normal
* ruby -v: ruby 3.1.2p20 (2022-04-12 revision 4491bb740a) [x86_64-linux]
* Backport: 2.7: UNKNOWN, 3.0: UNKNOWN, 3.1: UNKNOWN
----------------------------------------
`IO#set_encoding` behaves differently when given a single String argument than when
given two arguments (whether Strings or Encodings) in the case where the external
encoding is set to binary and the internal encoding is set to any other encoding.
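Note that `binary` is an alias for the `ASCII-8BIT` encoding, which is the name shown
in the output:
```ruby
Encoding.find('binary')  # => #<Encoding:ASCII-8BIT>
```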
This script demonstrates the external and internal encodings that an IO instance ends
up with for several nominally equivalent ways of calling `#set_encoding`:
```ruby
#!/usr/bin/env ruby

# Report the encodings an IO ends up with after a #set_encoding call.
def show(io, args)
  printf(
    "args: %-50s external encoding: %-25s internal encoding: %-25s\n",
    args.inspect,
    io.external_encoding.inspect,
    io.internal_encoding.inspect
  )
end

File.open('/dev/null') do |f|
  # Single combined "external:internal" mode string
  args = ['binary:utf-8']
  f.set_encoding(*args)
  show(f, args)

  # Two String arguments
  args = ['binary', 'utf-8']
  f.set_encoding(*args)
  show(f, args)

  # Two Encoding arguments
  args = [Encoding.find('binary'), Encoding.find('utf-8')]
  f.set_encoding(*args)
  show(f, args)
end
```
This behavior is the same from Ruby 2.7.0 to 3.1.2.
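One possible way to sidestep the divergence until it is resolved (a suggestion, not a
fix) is to stick to a single calling convention, e.g. always passing explicit Encoding
objects:
```ruby
# Hypothetical workaround: avoid the combined "external:internal" mode string
# so only one of the argument-handling paths compared above is exercised.
File.open('/dev/null') do |f|
  f.set_encoding(Encoding::BINARY, Encoding::UTF_8)
  p [f.external_encoding, f.internal_encoding]
end
```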
--
https://bugs.ruby-lang.org/