caoimghgin opened this issue 1 year ago
Is there a different solution you have in mind to replace it with or a different relative unit that is platform-agnostic?
They have provided a unit for absolute values (px) and a unit for relative values (rem) in the spec and have noted that a translation tool will need to provide appropriate conversions when preparing tokens for a specific platform.
There is definitely value to having absolute and relative values available for dimensions, so if there is another direction that is better, it'd be great to look into it.
If the spec provided absolute units, e.g. a fontSize: '16px' output, I'm all good. Please close the request.
REM and EM are appropriate values, while the pixel is not:
Apple lists its default font sizes for iOS: https://developer.apple.com/design/human-interface-guidelines/typography
Body text defaults to 17pt, which corresponds to one rem; depending on the user's preferences, the value of one rem or one em changes for accessibility.
Android: Material design has a conversion table in their guidelines on moving from REM to platform-specific values. https://m3.material.io/styles/typography/type-scale-tokens
Using pixel values for design tokens is the wrong approach these days. REM and EM values, on the other hand, are quasi-platform-agnostic units that most design systems relate to as well.
A traditional px (or point) is the appropriate value, whereas REM/EM are derived values driven by the idea of a CSS Pixel.
I believed 1pt was always equal to 1px (1/72") because iOS specifies typography in points, and that is how print works. My belief is confirmed when I spec typography at size 17 (in Sketch, Figma, or Illustrator) for a developer handoff: I can screenshot the render of that design on my iOS device and expect it to overlay perfectly on my designs. Even though Illustrator says the typography is in points and Figma's documentation says it's in px, both line up perfectly.
The problem occurs only on the web, with the idea of a CSS Pixel. This strange concept gives us math such as...
17pt == 22.66px == 1.4167rem
https://websemantics.uk/tools/font-size-conversion-pixel-point-em-rem-percent/
This ASSUMES a root font-size of 16px, and as we know, that is a variable that could be changed to anything in the web dev environment.
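For reference, the arithmetic behind that conversion is simply this (a quick sketch, assuming CSS's 96px-per-inch definition and the default 16px root):

```js
// pt -> CSS px -> rem, assuming 96 CSS px per inch and a 16px root font size
const pt = 17;
const px = pt * (96 / 72);  // 22.666…
const rem = px / 16;        // 1.4166…
console.log(px.toFixed(2), rem.toFixed(4)); // "22.67" "1.4167" (the 22.66 above truncates rather than rounds)
```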
Therefore, we should consider using the unit specification that matches the input defined by the designer in a design application as 'the standard', and leave the job to programmers to transform to any platform-specific values they like via Style Dictionary transforms.
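For example, a transform along these lines is what I have in mind (a sketch only, assuming Style Dictionary v3's registerTransform API; the transform name and the 16px base are choices the engineering team would make, not anything the spec defines):

```js
const StyleDictionary = require('style-dictionary');

// Convert unitless design values (treated as px / 1/72" pt) to rem for the web build.
StyleDictionary.registerTransform({
  name: 'size/design-to-rem',   // hypothetical name
  type: 'value',
  matcher: (token) => typeof token.value === 'number',
  transformer: (token) => `${token.value / 16}rem`, // 16 is the team's chosen base
});
// ...then include 'size/design-to-rem' in the web platform's transforms list.
```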
As I consider this more, it seems the correct way to express typography size as defined in design programs such as Sketch, Figma, AdobeXD, and Adobe Illustrator is the unit of POINTS, or 1/72".
In a traditional sense, the units of pixels and points were considered identical, but this is no longer true. Consider iOS Retina displays, and Android 1x, 2x, 5x. For the web, the physical size of a monitor pixel varies around the nominal 1/96". Pixels do not have an intrinsic physical size, whereas points do.
To complicate things even more, a CSS Pixel defines a pixel's physical size as 1/96", which I find curious because I have never typed 22.66 in an application to get the visual appearance of a 17pt font. Personally, I find it difficult to believe that 22.66px on screen (even considering a standard observer distance of X viewing distance) represents the physical size of 17pt as printed on paper.
The ultimate pass/fail is whether the coded implementation matches designs created in a design program, at normal user zoom. Design programs, as mentioned above, respect the 1/72" standard for both pixels and points. So...
Rather than px, the most correct absolute value would be pt, taking care to say NOT A CSS PIXEL POINT but a standard Point. This is the stuff that burns my brain.
ADDENDUM:
A better way to say it is that when it comes to typography, a 'px' and a 'pt' are indeed identical in size; both are 1/72 of an inch. To verify, type 'gh' at 100px size in Times New Roman in Figma, export as a PNG, and import into Illustrator. Measure the height (from top of the h to the bottom of the g) and you'll find it's close to 100px. Now, type 'gh' in Illustrator and set it to 100pt: once again, the same size. Next, do the same from CodePen, screenshot, and import into Illustrator. Once again, the same size. I HOPE that puts to rest the idea that 1/96 is a 'thing'. It only manifests when using the pt unit on the web.
If I may, I would like to address the elephant in the room: why do we have to force a unit value in the spec? Maybe I'm wrong here, but for the spec to be truly agnostic, the consumer of the spec dictates which units to use, right?
@phun-ky It's 'nice' to know what the value represents before we transform it into the unit we need for the platform accurately, but you make an excellent point (all puns intended).
I'm perfectly happy with the unitless value of fontSize: 17 as exported by Tokens Studio and Design Tokens. I've never had a problem.
But let's imagine a Figma plugin did provide the option to choose between different units on export. What functional benefit have we provided? Now, the Style Dictionary programmer must check for the existence of the unit flag and respond appropriately if it is set to rem, but no other unit (because only two units are proposed). And, to transform correctly, the programmer needs to reference a global font-size, which I suspect the spec does not provide?
Well, I've experienced that you can have more than one unit in a component. What if the spec dictated a default unit, and then added an option to override it for special cases? Pick the unit that is most common or easiest to convert from?
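To illustrate what I mean, something like this shape (purely hypothetical; a $defaultUnit property like this is not part of the spec, and the token names are made up):

```js
// Hypothetical token file shape -- $defaultUnit is NOT in the DTCG spec.
const tokens = {
  $defaultUnit: 'px',
  fontSize: {
    body:    { $type: 'dimension', $value: 17 },     // inherits the default px
    display: { $type: 'dimension', $value: '2rem' }, // per-token override for a special case
  },
};
```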
Wanted to chime in and note that Tokens Studio has a base font size feature, by default it's 16px, and all tokens written in rem would reference that. It can be changed to something else like 24px. https://docs.tokens.studio/tokens/settings#base-font-size
I think that's an excellent feature.
I agree with "the consumer of the spec dictates which units to use". If they want to write 16px, that's fine. If they write 1rem, that's also ok. I would like the spec to provide base font size though.
Please research rem vs. px in the context of accessibility. Dropping rem/em from this specification would set back efforts for a more open and accessible web platform by decades...
Having worked with frontend since '98, and with accessibility since validation and Section 508 came around, I can say you cannot exclusively have rem/em. This article sums it up: https://www.joshwcomeau.com/css/surprising-truth-about-pixels-and-accessibility/
@phun-ky Please re-read my comment, I did not say rem should be the only unit.
It is correct that you need multiple units, one of those must be rem, or a platform agnostic equivalent.
@romainmenke I'm sure neither phun-ky nor I was advocating the removal of rem from the web specifications or impacting WCAG accessibility. Everyone understands we need multiple units, especially if they have programmed Style Dictionary for multiple platforms.
The question remains, should rem be embedded in W3C Design Tokens specification?
In the future, I'd imagine nobody would manually type W3C Design Token JSON files but depend on Figma Plugins to create the files for them. Because Figma edit seats are somewhat expensive, developers typically do not have edit access to Figma libraries, so they are unable to run plugins. Therefore designers will be outputting JSON for the developers.
I'm a 'Design Technologist', so I'm concerned about areas of responsibility as designers and engineers interact with each other in our Design System. Simply put, I would not put the responsibility of choice of REM vs PX in the hands of designers and certainly would not ask the designer to choose the base font-size and unit.
Though we'd like a world where we could output JSON from Figma, run it through a generic install of Style Dictionary and get the result we need, that won't happen. We'll always need to customize Style Dictionary, so we should let it do what it does best. Keeping the W3C Design Token Specification simple helps SD do its job.
Instead of adding an explicit unit to the spec, consider standardizing on the current unitless numeric value that indicates px (or traditional 1/72" point). I believe adding units puts the responsibility of the definition in the wrong hands and IMHO is an optimization that adds unnecessary complexity.
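In other words, something like the first shape below rather than the second (illustrative token names; the second reflects the unit-string direction discussed in this thread):

```js
// Proposed: unitless numbers, px / 1/72" pt implied; units decided downstream by engineering.
const proposed = { fontSize: { body: { $type: 'dimension', $value: 17 } } };

// Unit-string direction: the unit travels with the value.
const withUnit = { fontSize: { body: { $type: 'dimension', $value: '17px' } } };
```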
The entire purpose of this specification is to create a standardized interface so that specific values can flow freely between tools (either design or translation tools).
If it becomes a requirement for developers to preprocess and manually rewrite values so that they have the correct unit, then I don't see the point of this specification :)
Designers should learn when to use relative and when to use absolute units imho. And design tools should embrace that both are needed.
I see this as a problem that needs to be solved for design tokens to fulfill their intended purpose.
It comes down to knowing the reason why tools such as Theo or Style Dictionary exist, the challenge of auto-exporting tokens from Figma via plugins, knowing the core mission of W3C Design Tokens, and an understanding of what a standard is able to accomplish.
THEO/STYLE DICTIONARY: Ability to write a single platform-agnostic JSON file containing all key/value pairs (tokens) that can be transformed into platform-specific variables for engineers to consume. While engineers must manually type a JSON file, they would not need to write CSS, SCSS, TypeScript, CSS in JS, Objective-C, Swift, Android, or Compose. In short, write once and output for multiple platforms. If you support one platform only, the JSON file becomes an extra step and is not required.
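For concreteness, a minimal Style Dictionary config fanning one token source out to several platforms looks roughly like this (a sketch using Style Dictionary's built-in transform groups and formats; paths and file names are placeholders):

```js
// style-dictionary.config.js (sketch)
module.exports = {
  source: ['tokens/**/*.json'],
  platforms: {
    css: {
      transformGroup: 'css',
      buildPath: 'build/css/',
      files: [{ destination: 'variables.css', format: 'css/variables' }],
    },
    ios: {
      transformGroup: 'ios',
      buildPath: 'build/ios/',
      files: [{ destination: 'StyleDictionary.h', format: 'ios/macros' }],
    },
    android: {
      transformGroup: 'android',
      buildPath: 'build/android/',
      files: [{ destination: 'dimens.xml', format: 'android/dimens' }],
    },
  },
};
```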
FIGMA EXPORT OF TOKENS: The promise of Figma export of tokens is to remove the need for engineers to manually type the JSON files. However, since the output of exported tokens only matched the structure defined in Figma instead of the CTI structure Danny Banks defined for Style Dictionary, programmers would either need to customize Style Dictionary or continue to manually type the JSON files.
W3C DESIGN TOKENS MISSION: "The Design Tokens Community Group's goal is to provide standards upon which products and design tools can rely for sharing stylistic pieces of a design system at scale." So long as we have standards, we can transform into anything.
STANDARDS: The JSON file (no matter how detailed) cannot be, could never be, the final document that does the transforms for us. Instead, the file needs to contain sufficient information so we are able to do anything - reliably, confidently, and at scale. Programming still needs to happen. No matter if it's Theo, Style Dictionary, or a brand-new thing, we simply need to know we can rely on something called X to give us the same value and unit.
Standardization of Design Tokens is great and solves a lot of problems. But, we need to understand what problems are being solved, why they are being solved, what the benefits are, what the scope of success is, and finally, what is outside of scope.
If you think of design tokens as expressions of design intent, then there is a big difference between the choice of a relative unit like rem and an absolute one like px. The former is saying "I intend this size to be relative to the user's choice of default font size", the latter is saying "I want this size to be the same, regardless of the user's settings".
I believe it's important to be able to express both of those intents, which is why the DTCG spec allows both.
We've borrowed CSS's unit names as we felt those are likely to be familiar to many folks working in and around design systems. I suppose we could have adopted Android's sp for this purpose instead, but I suspect fewer people are familiar with that. Or we could have invented our own, but then we'd probably have ended up writing stuff like "the DTCG flibble unit is a multiple of the user's preferred font size - just like rem in CSS" 😜
As for why have units at all, think of the px and rem units in the DTCG spec as a means of expressing the relative vs absolute dimension intent I outlined above. In order for translation tools like StyleDictionary or Cobalt to "know" how to convert a token value into the appropriate platform-specific value and syntax, they need that info. As others have pointed out, it's not just the Web that has this concept. For example, Android has sp, which is essentially 1/16th of a rem.
Sadly, design tools like Figma do not currently have this concept, but I don't believe that's a reason to limit the spec's expressiveness (if anything, I'd hope it might nudge design tool makers to add support for something akin to rem). I'm actually more concerned about whether or not we ought to add more relative units (e.g. %, vw, vh, ch...) in the future, so as not to limit what design intents people can express in the format.
It's also worth noting that design tools (or any tools that might create or manipulate DTCG files) don't necessarily need to expose the "raw" DTCG values to their users. Just as an export tool like StyleDictionary might read a DTCG rem value and convert it to, say, an equivalent sp value, a tool like Figma could read a DTCG rem value and display it to users in some other way. It's also reasonable (though not ideal) for a lossy rem to px conversion to occur in tools or platforms that lack the concept of sizes that are relative to the default font size.
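As a rough sketch of the kind of mapping a translation tool might perform (my illustration only; the 16 multiplier and the lossy iOS fallback are assumptions, not anything the spec mandates):

```js
// How a translation tool might emit a DTCG rem value per platform (sketch).
function emitDimension(remValue, platform) {
  switch (platform) {
    case 'web':
      return `${remValue}rem`;       // keep the relative intent
    case 'android':
      return `${remValue * 16}sp`;   // sp as roughly 1/16th of a rem, per the note above
    case 'ios':
      return `${remValue * 16}`;     // lossy absolute fallback: no native rem equivalent
    default:
      throw new Error(`Unknown platform: ${platform}`);
  }
}

console.log(emitDimension(1.25, 'android')); // "20sp"
```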
In the future, I'd imagine nobody would manually type W3C Design Token JSON files but depend on Figma Plugins to create the files for them. Because Figma edit seats are somewhat expensive, developers typically do not have edit access to Figma libraries, so they are unable to run plugins. Therefore designers will be outputting JSON for the developers
While I'm sure many teams will operate in that way, we can't assume that designers are always the exclusive "owners" of design tokens, or that there is just a one-way flow of information from design tool to code. Tools like ZeroHeight, Supernova, Interplay and others already let you create and edit tokens, which can then be synced back to Figma as well as exported to code. Other tools to visualise, organise and manipulate tokens are emerging too, such as Tokens Studio's Flow tool. My hope is that the DTCG format will one day allow teams to pick and mix any combination of such tools and construct whatever design token flows they want.
The "source of truth" for a team's design tokens thereofre doesn't have to be Figma. It could be a DTCG file in a git repo (which might be edited by hand or via some dedicated token editor tool), or some kind of token management tool that can import & export DTCG files. In that case, that is where are team can capture their absolute vs relative design intents - as long as we retain the rem
unit 😜. A design tool like Figma could then read from that and, until it has some native equivalent to rem
, do a lossy conversion to px (yes, I know Figma doesn't display a unit, but let's face it those numbers are effectively absolute px (web) /dp (Android) / pt (iOS) values).
When implementing designs, a developer would hopefully use the token names used in the design as their guide and reference the respective variables in their code which would be relative or absolute as needed.
Long story short, I'm very much in favour of keeping the rem unit (and units in general) in the DTCG format. Unless there are strong objections, I think we should close this issue.
I wasn't 100% opinionated when I created this issue, but as the thread has evolved, I must admit I feel strongly against mixing and matching units in the spec. As a programmer, I believe in the value of separation of responsibility. The W3C specification for tokens outputs key/values in a normalized format that we can reliably read. The transform tool (Theo/Style Dictionary) is responsible for translating the output to platform-specific code. Mixing those two concepts only adds to a list of if/else statements I'd much rather avoid. Furthermore, if the W3C spec does not include the base size, then I'm left guessing what 1rem means (is it 16px, or something else?). I hope I've made my points clear, and thank everyone for participating.
A few points that may be of help.
For example, Android has sp which is essentially 1/16th of a rem
Which gets to the crux of the issue. A rem is an expression of another variable: on the web, it's the HTML root font-size, which could be set to anything. Most use 16px, but some advocate for '62.5%' to make the units simpler in code. Therefore, REM is not a size in and of itself. Rather, it's closer to say an Android sp is 1/72", much like an iOS px unit. Both inherit the concept of a point size from print.
In order for translation tools like StyleDictionary or Cobalt to "know" how to convert a token value into the appropriate platform-specific value and syntax
If all is px, we know how to convert. Easy peasy.
The "source of truth" for a team's design tokens therefore doesn't have to be Figma.
Except, Figma IS the source of truth. With plugins, we export from Figma (or Sketch, or whatever), and we publish to repos of any sort.
Except, Figma IS the source of truth. With plugins, we export from Figma (or Sketch, or whatever), and we publish to repos of any sort.
I have to disagree on this. It's a two-way street, especially if you are using Tokens Studio with a GitHub setup: both designers and developers can commit changes, as long as they are on the DS team and have the proper permissions.
Tokens Studio is a plugin that exports tokens from Figma, but there are others. Personally speaking, Tokens Studio is a slow, buggy, and unintuitive solution, doing too much at once. However, love it or hate it, it seems clear the mission of the W3C Design Tokens spec is to be agnostic to any one technology or plugin, and certainly unopinionated on process and workflow.
However, as Design System professionals, we must depend on design truth generated from the design library files as defined by brand designers (published to N platforms). Otherwise, we have a much bigger problem which is likely outside of the scope of this initiative. In other words, the solution must be linear for scalability, not bi-directional.
:thinking:
I agree with @caoimghgin saying that it is out of scope, at least for this issue. A specification for design tokens (which is a part of something bigger, i.e. Design System), should not dictate where the source of truth lies.
Came here to open the same discussion. I think this is an important issue to solve, and the current situation is misleading for people approaching design tokens.
We know design tokens have core principles, and one of them is “platform agnostic”.
Most examples in the spec use CSS-ready raw values, which is misleading since it doesn't communicate the concept of "transforming the token for different platforms".
For example, rem may be supported on iOS and Android, but you still need an important piece to calculate the rem value: the base font-size, which is different on each platform. So a token saying "1rem" will be X on iOS and Y on the web. This rem calculation is part of the conversion and so is outside the spec.
Now, most design tools use absolute values (px), and that unit should be used for raw token values, which are then converted by authors providing the additional info.
E.g. a font-size-20: 20 token (without unit) can be transformed into:

- 20px
- Xrem
- 20pt (on iOS, pt are scaled based on screen density and are still the best unit)
- 20dp (density-independent pixel)

This discussion on the use of design tokens, their units, and the surrounding tooling has been incredibly insightful. Here are my thoughts:
The core question is whether design tokens should carry explicit units such as rem and px, or if they should lack such units, leaving the responsibility of conversion to the tooling.

I've suggested setting the root font size to 10px. This simplifies the correlation between the rem and px units, mainly targeting developer convenience and making conversions more intuitive.
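Concretely, the convenience is just a decimal shift (assuming the 10px root, e.g. set on the web via the 62.5% technique mentioned earlier in the thread):

```js
// With a 10px root, px <-> rem is just a decimal shift.
const ROOT_PX = 10;
const pxToRem = (px) => px / ROOT_PX;   // 16px  -> 1.6rem
const remToPx = (rem) => rem * ROOT_PX; // 2.4rem -> 24px

console.log(pxToRem(16), remToPx(2.4)); // 1.6 24
```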
On that note:
This approach offers a pragmatic solution for web development. The ease of converting rem values into pixel values by shifting a decimal point is attractive. However, considerations remain:
Flexibility: While perfect for the web, this system might have challenges on platforms like iOS and Android, which standardize on pt and dp units. The crux of the matter is whether design tokens should be versatile across platforms or if tool-specific conversions should bear that weight. Or, is there a universal standard, like the one I'm suggesting?
Accessibility: The draw towards rem stems from its adaptability to user preferences, such as their default font size. Assigning a fixed pixel value as the root might impede this accessibility feature. Nonetheless, this can be balanced if other accessibility aspects are championed by developers.
Adoption: The rationale behind this system is straightforward. However, widespread adoption might face challenges. Many developers and designers are accustomed to the default 16px root size. Changing this established norm could be challenging.
Recently, Figma has introduced local Variables, their own take on design tokens. This addition can be seen as a game changer, as it brings the management of design tokens directly into one of the most popular design tools. It simplifies the workflow, reducing the need for external tools or plugins for creating and managing design tokens.
However, one must consider whether this creates a new silo, potentially leading to inconsistencies if other tools or platforms are involved in the workflow. With that in mind, it is crucial to establish a clear "source of truth" and consider how this new feature integrates with the existing design and development processes.
In conclusion, my system presents an optimized route for web applications. The broader challenge is finding a middle ground. Should design tokens be platform-specific or universally neutral? While my proposal is tailored for web contexts, the dilemma remains: should we adopt a universally accepted standard for all platforms or place conversion responsibilities squarely on the tools?
What do we gain by having to explicitly include the unit, if the goal is to be platform/tool agnostic?
I'm actually more concerned about whether or not we ought to add more relative units (E.g. %, vw, vh, ch...) in the future, so as not to limit what design intents people can express in the format.
Please also consider variable units https://drafts.csswg.org/css-variables-2/#variable-units. Maybe my org wants to use a custom flibble unit in the future.
As it stands, requiring "px" or "rem" prevents me from being able to use maths like
"value": "{dimension.xs} * {dimension.scale}"
Preventing me from doing that might be the feature, not a bug, but without understanding why I need to be explicit with a value, it's frustrating.
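To make that concrete, this is the kind of resolution step a tool could perform if unitless maths were allowed (purely my illustration; neither the spec nor any current tool defines this):

```js
// Resolve {path.to.token} references to numbers, then evaluate the expression.
const tokens = { dimension: { xs: 4, scale: 1.5 } };

function resolve(expression) {
  const substituted = expression.replace(/\{([\w.]+)\}/g, (_, path) =>
    path.split('.').reduce((node, key) => node[key], tokens)
  );
  return Function(`"use strict"; return (${substituted});`)();
}

console.log(resolve('{dimension.xs} * {dimension.scale}')); // 6
```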
What do we gain by having to explicitly include the unit, if the goal is to be platform/tool agnostic?
That's why tokens need to be transformed based on the destination platforms supported. Any kind of unit should be banned because units are always platform-specific and outside the scope of the spec. And sadly, I would never follow a spec that imposes token units I don't need. If I don't build for the web, I don't care about rem, vh, ch, etc...
To follow up on previous posts, I believe fontSize: '16px' (which I mistakenly wrote earlier) should be output as a numeric fontSize: 16 in the tokens.json. The 'px' (or 1/72") unit is implied, as is 'standard' today. As @equinusocio, @markacianfrani, and others mention, this makes it easier for the token transformer to turn the input into platform-specific (and even brand-specific) output with reliable results. Designers output the tokens, but the engineering team(s) decide the units.
Sorry for pinging an old thread, but I've just put up a proposal where I'd suggest we NOT make this change: #244. The reasons are outlined there, but TL;DR, after reading this thread:
This is NOT an official decision yet! This is only a way to source feedback, and push this forward. Any/all feedback would be welcome. Thanks all for discussing this change 🙏
I've noticed font size values are spec'd as REM for W3C. Design token values should be platform agnostic. Native apps such as Android and iOS have no such concept, and the declaration of REM must point to another base value for rem to be meaningful.