Interlisp / medley

The main repo for the Medley Interlisp project. Wiki, Issues are here. Other repositories include maiko (the VM implementation) and Interlisp.github.io (web site sources)
https://Interlisp.org
MIT License

TEDIT scaling of hardcopy-display mode #968

Open rmkaplan opened 1 year ago

rmkaplan commented 1 year ago

Tedit has a feature for displaying a paragraph using fonts whose widths are scaled to what they would be on a corresponding hardcopy stream, but with glyphs taken from the display font. The result is ugly, often with overlapping characters, but you can see how the hardcopy lines would justify and be laid out.

There is some confusion in the code about how this is implemented, in particular, what scale should be used at various points so that margins and selections take account of the scaled widths.

Since display dimensions (and display fonts) are measured in points and hardcopy fonts are measured with much greater precision, the general idea is to scale hardcopy widths to points, and to scale the resulting high-precision line lengths back to points as well.

One confusion in the code is what exactly to use as the scaling factor. PostScript fonts appear to have a scaling factor of 100 (i.e. they specify widths in 100ths of a point). But the code has a lot of places where it uses the constant MICASPERPT (=35.27778, i.e. widths scaled in micas, units of 10 microns).

I presume (vaguely remember) that Interpress used micas, and the code therefore has inconsistencies when we go to hardcopy-display mode for PostScript.
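
For concreteness, a rough sketch of the two width scales in play (the function names are hypothetical; only MICASPERPT and the 1/100-point PostScript unit come from the behavior described above):

```lisp
;; Rough sketch of the two width scales in play (function names hypothetical).
(defconstant MICASPERPT 35.27778)        ; micas per point (1 mica = 10 microns)

(defun ps-width-to-points (w)
  "PostScript font widths appear to be given in 1/100 point."
  (/ w 100.0))

(defun interpress-width-to-points (w)
  "Interpress font widths are given in micas."
  (/ w MICASPERPT))

;; Unscaling a 1/100-point width with MICASPERPT (or vice versa) is off by
;; a factor of 100/MICASPERPT, about 2.8 -- the inconsistency described above.
```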

The hardcopy menu item in the paragraph menu doesn't let you specify your target file format; I think it just assumes Interpress. But however the format is determined (maybe by the preferred format of the default printing host, which would now be PostScript), it seems that all of the scaling and unscaling should be with respect to the scale of fonts for that device, and that the code should no longer make reference to mica-based constants.

Does that seem right? Alternatively, I suppose we could pick micas or 100ths as our own uniform internal measurement precision, and convert the font-widths when we read them in.

Any thoughts?

nbriggs commented 1 year ago

There are two coordinate systems in an Interpress master, the master’s coordinates (which are arbitrary and defined by the emitter) and the device coordinates. The renderer translates between them with the information it is given in the master and the information that it knows for the device it is rendering on. PostScript works the same way.

From page 42 of the Interpress standard, http://bitsavers.org/pdf/xerox/interpress/610P72582_XSIS_048404_Interpress_Electronic_Printing_Standard_V2.1_198404.pdf:

"For example, the creator might choose to represent all coordinates in units of 1/10 printer's point. or 1/720 inch. For an 8.5 X 11 inch page. coordinates would lie in the range 0<=x<=6120, 0<=y<=7920. The transformation from master to ICS would scale by 0.0254/720.

A particularly convenient unit for master coordinates is the mica. equal to.10^-5 meter. For an 8.5x11[inch] page, coordinates lie in the range 0<=x<=21590. 0<=y<=27940, which can be represented by a Short Number (§ 2.5.2). The mica offers sufficient precision for most routine typographic needs.”
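
Spelling out the arithmetic in that passage (just a sanity check, nothing Medley-specific): a mica is 10^-5 meter, so there are 2540 micas per inch, and with 72 points per inch that is where the value of MICASPERPT comes from.

```lisp
;; Unit arithmetic from the quoted passage (nothing Medley-specific).
(* 8.5 720)     ; => 6120.0   x extent of an 8.5x11 inch page in 1/720-inch units
(* 11 720)      ; => 7920     y extent
(* 8.5 2540)    ; => 21590.0  x extent in micas (2540 micas per inch)
(* 11 2540)     ; => 27940    y extent in micas
(/ 2540 72.0)   ; => ~35.278  micas per point, i.e. MICASPERPT
(/ 0.0254 720)  ; => ~3.53e-5 meters per 1/720-inch master unit
```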

rmkaplan commented 1 year ago

We could decide on an internal precision for hardcopy-display, and create a different width vector, scaling the widths of each hardcopy font to that unit (say micas), for each font that goes into a particular hardcopy-display. For example, if we picked micas internally, then we would scale the PostScript hardcopy-display fonts by MICASPERPT/100.

The advantage would be that all the other places that scale horizontal dimensions (margins, tabs) could scale by a constant, and the particular mode being simulated would not have to propagate through the code. The disadvantage would be the cost of constructing, on the fly, the fonts that would be needed for a new device (e.g. POSTSCRIPT-DISPLAY, INTERPRESS-DISPLAY…). Since this hardly ever happens (you have to click “hardcopy” in the paragraph menu), the overhead of creating the font in one place might well be worth the more general simplification. The original code was somewhere in the middle.
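
As a sketch of what that construction might look like (the function and device names here are made up; only MICASPERPT and the 1/100-point PostScript width unit come from the discussion above):

```lisp
;; Hypothetical sketch: build the width vector of a hardcopy-display font in
;; micas once, when the POSTSCRIPT-DISPLAY (or INTERPRESS-DISPLAY) font is
;; created, so that everything downstream scales by a single constant.
(defconstant MICASPERPT 35.27778)

(defun device-width-scale (device)
  "Factor that converts one DEVICE width unit into micas."
  (case device
    (POSTSCRIPT (/ MICASPERPT 100))     ; widths given in 1/100 point
    (INTERPRESS 1)                      ; widths already in micas
    (t (error "Unknown hardcopy device: ~S" device))))

(defun make-hardcopy-display-widths (device raw-widths)
  "RAW-WIDTHS is the device font's width vector; the result is in micas."
  (let ((scale (device-width-scale device)))
    (map 'vector (lambda (w) (round (* w scale))) raw-widths)))
```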

nbriggs commented 1 year ago

Yes. I think having an internal precision for hardcopy-display makes sense.

The Adobe Font Metric documentation (https://adobe-type-tools.github.io/font-tech-notes/pdfs/5004.AFM_Spec.pdf) says:

3.2 Units of Measurement

All measurements in AFM, AMFM, and ACFM files are given in terms of units equal to 1/1000 of the scale factor (point size) of the font being used. To compute actual sizes in a document (in points; with 72 points = 1 inch), these amounts should be multiplied by (scale factor of font)/1000

I don’t know if we plan on using scalable multi-master fonts, but there’s documentation (in the AFM spec above) on interpolating metric information for multiple master font programs. It has to compute a weighted average of the widths for a character from each of the master designs in the font program. The WeightVector value provides the factors. While it’s trivial math it’s a pile of calculation!
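
As a sketch of both calculations (the 1/1000 unit and the WeightVector averaging are as described in the AFM spec; the function names are made up):

```lisp
;; Sketch of the AFM arithmetic described above (function names made up).
(defun afm-width-to-points (afm-width point-size)
  "AFM widths are in units of 1/1000 of the font's scale factor (point size)."
  (/ (* afm-width point-size) 1000.0))

(defun multiple-master-width (weight-vector master-widths)
  "Interpolated width: the WeightVector-weighted average of the character's
width in each of the master designs."
  (reduce #'+ (mapcar #'* weight-vector master-widths)))

;; e.g. a 600/1000-em glyph at 10 points: (afm-width-to-points 600 10) => 6.0
```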

— Nick

D-Van-Buer commented 1 year ago

Having actually written programs that generate PostScript (and a little bit of Interpress), I have a few comments. Both languages use a very similar programming model. There is a key low-level difference in how things are measured: PostScript starts out in units of a point, while Interpress uses meters (I once got an unexpected output page that was a tiny corner of a letter several meters high).
Most programs that generate these languages start each page by scaling and transforming the coordinate system to something convenient for the generator, including rotating the world for landscape vs. portrait. I have not inspected the Tedit code to see its internal model, but it is likely tied to the display system, which back then treated a pixel as 1 point high and wide. In a quick look at POSTSCRIPTSTREAM I get the impression that it expects (at least some) measures in 0.01-point units (\PS.SCALE0 is 100), which is smaller than the laser spot width on any printer. So I would speculate that all hardcopy interface code is expected to use a device-independent metric and let the drivers deal with it (e.g. in the page setup code). If the printer font interface in Medley isn't already independent of font metric internals, it should be.
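
To illustrate the kind of page-setup scaling I mean (the \PS.SCALE0 value is the one I saw in POSTSCRIPTSTREAM; all other names are hypothetical):

```lisp
;; Hypothetical sketch: a device-independent measure (points here) mapped into
;; each driver's native unit during page setup.  PostScript user space is in
;; points; Interpress master coordinates are taken to be meters; the
;; POSTSCRIPTSTREAM internal unit is 1/\PS.SCALE0 of a point.
(defconstant PS-SCALE0 100)                   ; corresponds to \PS.SCALE0
(defconstant POINTS-PER-METER (/ 72 0.0254))  ; 72 points/inch, 0.0254 m/inch

(defun points-to-device-units (pts device)
  (case device
    (POSTSCRIPT (* pts PS-SCALE0))            ; 0.01-point units
    (INTERPRESS (/ pts POINTS-PER-METER))     ; meters
    (t (error "Unknown device: ~S" device))))
```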