My biggest gripe with LaTeX is the tooling. During my last paper I ended up using a makefile that would usually work. When it didn't, running it twice would fix the issue. In the rarest cases I had to run `git clean -xdf` and the next run would work.
I still have no idea what was going on, and most makefiles out there seem to be obscenely complex, simply parsing the output and running the same commands again if a certain set of errors occurred.
Coincidentally, that is basically how you compile LaTeX.
A seemingly identical action, from the performer's point of view, is performed in a different environment each time it is repeated. Unless you are Laplace's demon, you can't say for sure that you are repeating the same action over and over, because the environment could change in the meantime in unimaginable ways, and that could influence the outcome.
I just really hate that quote because it is detached from reality.
However, with LaTeX, the output of the first run is often an input to the second run, so you get notably different results if you only compile once vs. compiling twice. When I last wrote LaTeX about ten years ago, I usually encountered this with page numbers and tables of contents, since the page numbers couldn't be determined until the layout was complete. So the first pass would get the bulk of the layout and content in place, and then the second pass would do it all again, but this time with real page numbers. You would never expect to see something like this in a modern compiler, at least not in a way that's visible to the user.
(That said, it's been ten years, and I never compiled anything as long or complex as a PhD thesis, so I could be wrong about why you have to compile twice.)
It was 25 years ago, though, but apparently this part did not change.
That said, I was at least sure that I would get an excellent result, and not end up like my friend who used MS Word: one day his file was "locked", he could not add a single letter to it, and he had to retype everything.
Compared to that, my concern about where a figure would land in the final document was nothing.
I always feel like I’m doing something wrong when I have to deal with LaTeX and lose hours to fighting with the tooling. Even with a clean install on a new machine it feels like something fails to work.
The last time I had to change a document I had to go through what felt like 100 different search results of people with the same issue before I found one where there was a resolution and it was completely obscure. I tried to help out by reposting the answer to a couple other locations, but I was so exhausted that I swore off LaTeX for any future work unless absolutely unavoidable.
It reminds me a little bit of the problem of Linux distributions. Linux is supposed to be a system built on the bazaar model instead of the cathedral model. Except what you actually end up with is that each distribution becomes its own cathedral, because building a whole system requires major decisions to be made. LaTeX class files feel like the same thing.
That also solves the problem of having to install various extension packages or fonts locally: everything is already there, and after writing a paper you can submit it directly to some conferences or journals from that Web GUI instead of having to email it or upload it to a third site.
Latexmk is one way to address this problem. A good IDE like AUCTeX can also figure out how many times the compiler should be invoked.
Good IDEs will also provide other invaluable assistance, like SyncTeX (jumping from the source to the exact point in the PDF, and back).
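For reference, latexmk needs almost no configuration; a minimal `.latexmkrc` (these are standard latexmk options) is enough to make `latexmk paper.tex` rerun the toolchain until cross-references converge:

```perl
# .latexmkrc
$pdf_mode   = 1;  # build the PDF directly with pdflatex
$bibtex_use = 2;  # run bibtex/biber whenever the .bib files change
```

After that, `latexmk -c` cleans up the auxiliary files when you're done.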
That's certainly part of it, but any typesetting program will need multiple passes to properly handle tables of contents—you can't know a section's page number until you've compiled everything before that section (including the table of contents), but adding a new section to the contents could push everything ahead by another page. The only unique thing about LaTeX here is that it directly exposes these multiple passes to the user.
if [ -z "$(find . -name '*.bib' -print -quit)" ]; then
# Just two runs, to cover TOC building, etc.
pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
pdflatex -interaction=nonstopmode "$SOURCE_FILE"
else
pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
# bibtex wants the .aux basename, not the .tex file
bibtex "${SOURCE_FILE%.tex}" && \
pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
pdflatex -interaction=nonstopmode "$SOURCE_FILE"
fi
So I guess if you're using bibtex, then you need to run it three times, but otherwise only twice? This is to say... I'm glad those days are gone.
LaTeX will usually tell you by including a warning in the output ("LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right."), which no one reads, because it is so verbose. Not having that warning is not a guarantee that it's now stable either, so our Makefile actually compares the PDF files minus variable bytes like timestamps to know whether the build converged.
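As a sketch of that convergence check (the helper names are mine; this assumes the volatile fields sit on their own lines, which holds for pdflatex's uncompressed Info dictionary — compressed PDFs would need something like `qpdf --qdf` first):

```shell
# Hash a PDF while ignoring volatile metadata lines (/CreationDate,
# /ModDate, /ID), so two builds of identical content compare equal.
pdf_fingerprint() {
  grep -av -e '/CreationDate' -e '/ModDate' -e '/ID' "$1" | md5sum | cut -d' ' -f1
}

# Rerun pdflatex until the fingerprint stops changing (cap at 5 passes).
build_until_stable() {
  src="$1"; prev=""
  for _ in 1 2 3 4 5; do
    pdflatex -interaction=nonstopmode "$src" >/dev/null || return 1
    cur="$(pdf_fingerprint "${src%.tex}.pdf")"
    [ "$cur" = "$prev" ] && return 0
    prev="$cur"
  done
  return 1
}
```

The loop bound matters: if a label keeps oscillating (it happens with some packages), an unbounded "rerun until stable" would never terminate.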
The first is based on Todd C. Miller's LaTeX Resume Template:
- https://typst.app/project/rDUHMUg5vxl4jQ5q2grGPY
The second is an Enduring Power of Attorney:
- https://typst.app/project/rs9ZgGLhgM7iPvFs7PQv5O
The third, a will:
Admittedly, not the most renowned or best-known journals, but you have to start somewhere.
Journals justify their fees by claiming it's for typesetting, but all they really do is add extra work by nitpicking bibliography formats and so on (see the comments in this article about sentence case). Nobody cares about that. I don't think anyone even reads "journals" any more (except maybe Nature/Science etc.). They mostly just read individual papers, and then there's no consistency to maintain.
In a sane world journals would accept PDFs. They would check that the format roughly matches what they expect, but not insist on doing the typesetting themselves.
Oh well, maybe one day.
On consistency: what journals provide is some level of QA (how much is a function of field and journal, rather than of what is charged), and the template is the journal's brand, so both authors and journals benefit from the style (I can tell the different, similarly high-quality journals in my field apart at a glance by their styles).
It's also worth noting that there's a whole bunch of metadata that needs to be collected (whether you agree with it or not, funders require it), so a PDF isn't going to cut it here either.
Using PDF as an input format would make editing and typesetting practically impossible. Not that I haven't seen volumes where publishers did that, but the results are abysmal, and in my experience it only occurred with local "grey literature" like really crappy conference proceedings edited in an institute.
But they don't do any editing or typesetting. They say "use our template" and "the author's second initial needs to be italic in citations". That's my whole point.
Because they used to actually be doing that. Historically, science journals were pay-to-play because the journal had to typeset your document and print it. But with the advent of computers, they had to pivot while still retaining their revenue streams.
Perhaps the hardest part has been relearning the syntax for math notation; Typst has some interesting opinions in this space.
I took a hiatus from LaTeX (got my PhD more than a decade ago). I used to know TikZ commands by heart, and I used to write sophisticated preambles (lots of \newcommand). I still remember LaTeX math notation (it's in my muscle memory, and it's used everywhere including in Markdown), but I'd forgotten all the other stuff.
Claude Code, amazingly, knows all that other stuff. I just tell it what I want and it gets 95% of the way there in 1-2 shots.
Not only that, it can figure out the error messages. The biggest pain in the neck with LaTeX is figuring out what went wrong. With Claude, that's not such a big issue.
One of the best things about Typst is that most tasks are very simple. Compared to the reams of LaTeX BS I was replacing, building my book with Typst is monumentally simpler.
1. LaTeX is sufficient for all document publishing needs. (E.g., converting LaTeX to HTML ranges from bad to non-existent, while Typst has HTML export.)
2. LLMs are sufficient for solving every problem one can encounter.
3. Things that are easier for humans are not also easier for LLMs.
4. New releases of LLMs will not learn more about Typst.
At the end of the day I'm not trying to migrate anyone. Use whatever you feel is best. For my use cases I'm convinced Typst is a better option than Latex.
Yeah, the syntax of Typst looks nice, but that's just not a big advantage anymore.
.. because a new language might be better?
But moving forward it’ll be harder to tell if any given new language is better than existing alternatives. LLMs burden their users with an almost insurmountable status quo bias.
Consider the counterfactual of LLMs being available in the 1990s, trained mainly on the world's C code. Perhaps we would still be writing C exclusively today, since code in new languages could not have been synthesized as easily or conveniently. It's not just about Typst or typesetting specifically, but about programming language design in general, and the fact that improvements are becoming much harder to push through.
I'm not actually sure that would be a bad thing? All the reasons that immediately come to mind to move away from C have to do with ergonomics and safety, the latter largely being a product of the former IMO. If an LLM can ingest my entire codebase and do 90% of the work to get me to the changes I need, doesn't that obviate the majority of the motivation to change languages in the first place?
If it still programs in a human-readable language, that means people need to review the code, at least from time to time. And it's much easier to review modern languages than C.
I say that as someone who uses a tricked-out Vim for my own LaTeX workflow, and VS Code for several programming languages.
It also has first class support for unicode (as does LaTeX via some packages) which if combined with a suitable keyboard layout makes both writing and reading math source code infinitely more pleasant :)
I really like the simple syntax that Typst provides. It would be much harder for the PMs to edit the templates if we went with other solutions, such as wkhtmltopdf.
We also looked into other document generation services that provide a WYSIWYG interface, and they are all quite expensive and often lack advanced scripting capabilities.
The team I'm currently working with are using Gotenberg for things which we can afford to take a little while, and C#/Skia for things which need to be reasonably quick.
The main issue I found with an HTML-based approach is that browsers are not designed for papers. It would be very challenging, though still possible [1], to customize the page layout, headers, and footers. Nonetheless, we have even more advanced use cases that only Typst/LaTeX could cater to, such as repeating the header of a table that spans multiple pages on every page.
[1] https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_media_q...
1. They should have carried forward the LaTeX standard as-is for math, instead of getting rid of the backslash escape sequences, etc.
2. There is no way to share a variable across file scopes, so you can't have a setting that is shared across files, not even with state variables.
Other than this, typst is solid, and with the neovim editor and tinymist lsp, is great to write with.
$
ZZ &= { ..., -1, 0, 1, ... } \
QQ &= { p/q : p, q in ZZ }
$
$
a = cases(
0 & quad x <= 0,
mat(1, 2; 3, 4) vec(x, y) & quad x > 0
)
$
instead of \begin{align*}
\mathbb{Z} &= \{ \dots, -1, 0, 1, \dots \}, \\
\mathbb{Q} &= \left\{ \frac{p}{q} : p, q \in \mathbb{Z} \right\}
\end{align*}
\[
a = \begin{cases}
0 & \quad x \leq 0, \\
\begin{pmatrix}1 & 2\\ 3 & 4\end{pmatrix}
\begin{pmatrix}x\\y\end{pmatrix} & \quad x > 0
\end{cases}
\]
Regarding point 2: you can put your settings in a file `settings.typ` and import it from multiple files.

Let's say I have 3 flavors of settings and 10 different .typ files. Normally I'd just have 3 flavors of top.typ (top1.typ, top2.typ, top3.typ), each with the correct settings propagated to all 10 files. Compiling top1/top2/top3 would then create flavor1.pdf, flavor2.pdf, and flavor3.pdf.
Now how do I do that with settings1.typ, settings2.typ, and settings3.typ? I have to go into the 10 different files and include the appropriate settings file! Or employ hacks like generating a common settings.typ with bash in the Makefile and including that common settings.typ in the 10 different files.
Edit: This is an actual use case - I'm helping with a resume, and have 3 different resume styles - a resume, a cv, and a timeline - and different files like education, work experience, honors, awards, publications, projects, etc and the level of detail, style, and what is included or not in each is controlled by which resume style is active. In latex I did this using \newcommand and the ifthenelse package.
In Typst, I have had to resort to passing these global settings as arguments to functions spread across these different files, so each resume item (function) instantiated from the top file has a bunch of parameters like detail_level = 1, audited_courses = true, prefix_year = false, event_byline = true, include_url = true, etc., which makes the functions unwieldy.
Alternatively, you can pass global settings at build time with `typst c --input name=value`
Maybe I misunderstood though, if you can link to an actual example (gist or something) I'd be happy to try and give a concrete solution.
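For the record, a sketch of how the `--input` approach looks on the Typst side (the key names here are made up; `sys.inputs` is the real dictionary that `--input` populates, and the values always arrive as strings):

```typst
// invoked as:
//   typst compile resume.typ --input style=cv --input detail=2
#let style = sys.inputs.at("style", default: "resume")
#let detail = int(sys.inputs.at("detail", default: "1"))

#if style == "cv" [
  Full publication list goes here.
] else [
  Short version.
]
```

This keeps the 10 content files unchanged; only the build command selects the flavor.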
Yes, but each included file (like education.typ, publications.typ, etc.) should also get these settings propagated from the top, which Typst doesn't allow; the appropriate settings file needs to be imported in each of these files.
> you can pass global settings at build time with `typst c --input name=value`
This is something I did not know - will check.
Or you can import settings.typ in all files that need it (education.typ, etc.).
What doesn't work is to have a file like top.typ contain
#import "settings.typ": *
#import "education.typ": *
and hope that this will make the settings available in education.typ. Because each .typ file is "pure" in the sense that it only knows the variables/functions that are defined in the file, or imported. This way you don't have one file magically affecting the bindings available in another, which is nice.

It's true there are cases where you'd like something like the above. Currently you can do something like that using states and context (basically putting the "settings" into the document and retrieving them there), but it's not so nice. In the future the plan is to make this nicer by allowing custom type definitions (and having show rules and set rules work with them as they work with built-in types).
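A sketch of that state/context workaround, for completeness (file layout and key names are hypothetical; `state`, `update`, `context`, and `get` are the real Typst primitives):

```typst
// settings.typ -- one shared state value with defaults
#let settings = state("settings", (detail: 1, include-url: true))

// top.typ -- pick the flavor once
#import "settings.typ": settings
#settings.update((detail: 2, include-url: false))

// education.typ -- read the settings anywhere later in the document
#import "settings.typ": settings
#context [Detail level: #settings.get().detail]
```

The catch is that every read has to happen inside `context`, which is why it's "not so nice" compared to plain variables.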
It’s not like keeping the syntax would really gain Typst anything besides folks not having to learn new things.
I’ve been able to avoid LaTeX. At uni, I went for org-mode -> LaTeX, which was OK except when my .emacs file was filling up with LaTeX stuff to make random stuff work. To be honest, that means I probably can’t even compile it again if I wanted to.
Typst has been awesome when I’ve used it (with LaTeX I always ran into it being horribly inconsistent around layout). Hope it continues.
However its handling of introspection and convergence gives me a bad feeling.
https://docs.racket-lang.org/scribble/getting-started.html#%...
Although it doesn't look like Scheme, it has the full power of Scheme.
Sure, but in order to iterate you won’t have to compile the whole document; you can just keep recompiling the chapter you are working on by structuring it with \include.
I use LaTeX as a tool for layout of books to print for hobby bookbinding and my current project - a 3 megabyte, 500k word beast of a novel - only takes around 10 seconds to compile.
I can't imagine what the friend of the author here had going on in his paper such that his compile times took such a hit. Required use of specific LaTeX libraries, dictated by the journals he was submitting to, that were written inefficiently? Specific LaTeX features or packages that end up hitting significantly slower code paths?

Makes me wonder if it's not LaTeX itself that is the speed issue, but rather the very maturity of the ecosystem that it has as an advantage over Typst. It could entirely be possible that once Typst has the wide breadth of features and functionality available through its ecosystem, it would become just as easy to fall into compile-time tar pits.
For further reference, a single-pass compilation of my thesis currently takes 25 seconds, and multiple passes are of course needed if the layout/bibliography changes. I ended up setting up TeXstudio to always compile only once for the preview, and then run the full N compilations for the final build. That, plus liberal use of \includeonly, made compile times not much of an issue.
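For anyone who hasn't used it, `\includeonly` is a one-line switch in the preamble (the chapter paths here are hypothetical):

```latex
\documentclass{book}
% main.tex: compile only the chapter being edited
\includeonly{chapters/results}  % comment out for the full build
\begin{document}
\include{chapters/intro}
\include{chapters/methods}
\include{chapters/results}  % only this file is retypeset; page numbers and
                            % cross-references of the others are kept via .aux
\end{document}
\end{document}
```

Wait, one `\end{document}` is enough; the point is simply that `\include` (not `\input`) is what makes `\includeonly` work.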
I don't use latex anymore and I don't have a use case for typst, so I'm not currently using it, but I follow the advancements from time to time, and I have to disagree with the advisor.
Typst is perfectly fine for replacing LaTeX in almost any place that doesn't require LaTeX source. The other catch is that the ecosystem is much smaller, so if you need a specific extension that doesn't exist and isn't trivial to implement, you'll be out of luck and stuck with LaTeX.
I admit that typst is quite promising and given enough time, its adoption will increase quite a lot.
Typst appears to be a mix of open source and closed source; the general pattern here is that the open source part gets neglected and critical features land in the closed source portion. Which is to say, it's unlikely to outlive the company itself.
I made no claims about any mixes or claims about LaTeX.
> If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called Overleaf Server Pro. It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). Find out more!
typst, the project, is not by any means a "mix" of open and closed, even if typst, the company, is. indeed, the most thorough LSP implementation available (tinymist) is not only open source but a community project. for another funny example see typstify, a paid typst editor not affiliated with the company. [2]
[1]: https://github.com/typst/typst/issues/145#issuecomment-17531...
Imo the situation is more like if overleaf were also the people who made the LaTeX project originally.
I think the only possible issue with the Typst org dying (assuming it happens after the full 1.0 version, so it's mostly maintenance) is that packages are automatically downloaded from the Typst site. But an open mirror could trivially be made, since the set of packages is just an open source git repo and the closed source site just hosts tar.gz files of the folders in that repo. Not a big deal, I think.
The only way they can continue to gain traction is if they never ever in any way lock people to the web app. Documents must be portable, it's part of why someone would want typst anyways.
I do not see a future where this happens, and if it does it will be because the typst org has changed hands and is also no longer particularly relevant to the future of typst the language.
In principle, having a reliable source of funding for typst is great. However, as a journal this would make me hesitant: what if down the road some essential features become subscription-only?
It is a concern that there is a single company doing most of the development, but there is quite a bit of community involvement so I don't think it is an immediate concern
Like which critical features, for example?
And LaTeX has this for free? It's separated concerns, I think the analogy is Overleaf and LaTeX but just happened to be made by the same group of folks, it doesn't have to go down the monetization-at-the-cost-of-your-user route.
Yes, Overleaf is both free-as-in-beer [0] and free-as-in-speech [1]. The OSS version is pretty easy to self-host, but it's missing quite a few features from the paid version. I still prefer compiling from the command-line for most of my documents, but I run the self-hosted version for collaboration.
That sounds like a sign that overleaf is struggling, that they had to make that change.
And Typst is more generous there: you can collaborate with 3 people at no cost.
Yup. You used to be able to share projects with unlimited people via link sharing, but they annoyingly got rid of that last year [0]. And Overleaf's cheapest plan is still more expensive than a basic VPS, so it's actually cheaper to self-host (which is what I'm doing [1]).
> That sounds like a sign that overleaf is struggling, that they had to make that change.
Either struggling or realized that they have a captive audience—if your professor requires assignments to be typeset with LaTeX and assigns group projects, there aren't really any other options.
[0] https://www.overleaf.com/blog/changes-to-project-sharing
Some people can't grasp that Overleaf may be a really good product that has to compete with hundreds of both free and fully open source alternatives.
> “Free software” means software that respects users' freedom and community. Roughly, it means that the users have the freedom to run, copy, distribute, study, change and improve the software. Thus, “free software” is a matter of liberty, not price. To understand the concept, you should think of “free” as in “free speech,” not as in “free beer.”
https://www.gnu.org/philosophy/free-sw.html
Sometimes beer happens to be free, in which case it is referred to as "free beer". It's just an example.
So it's no different from fully open sourced projects.
The latter is a genuine concern. Will it be maintained? I like LaTeX a lot, but would I want to maintain its internals? No. Could I? If I were paid handsomely, yes. Emphasis on handsomely.
Which leads to another worry: LaTeX itself may be OSS, but down the line it is possible that maintained forks will be controlled by big publishers paying maintainers to deal with the insanity of its internals. And we all know how lovely those publishers are (凸ಠ益ಠ)凸
Unless academia collapses.
Any future corrections, additions or other modifications are made to the source, not the generated old pdf.
I don't know how many packages there are for working with tables, but 20 years ago, `tabu` was the most recommended package, until the maintainer stopped responding. Now the package is incompatible with almost everything else, leading to headaches when trying to compile old documents:
https://github.com/tabu-issues-for-future-maintainer/tabu
https://tex.stackexchange.com/questions/470107/incompatibili...
Typst at least has dependency pinning out of the box. If you value reproducibility, you should invent a similar mechanism for your LaTeX documents.
Also, I'm loosely following the activities around LaTeX on Github and Stackexchange and it seems that it's mostly maintained by three people or so (Carlisle, Mittelbach, Fischer), who - no offense - aren't getting any younger. I wonder how well LaTeX will be maintained if these long time contributors have to step down eventually.
- custom spacing - horizontal/vertical (after list dots/numbers etc as well)
- justifying the text
- custom margins
And for that, something like this is overkill. Less than TeX, but still overkill.
Hell, I can do with a flavoured markdown if it supports this.
Not everyone is into nostalgia. I don't try to take away LaTeX or vim from anyone; it's just not for everyone.
1. It doesn't generate 5 bloody files when compiling.
2. Compiling is instant.
3. Diagnostics are way easier to understand (sort of like Rust compiler suggestion style).
4. List items can be either - item1 - item2, etc. or [item1], [item2]. The latter is way better because you can use anchoring to match on the braces (like "%" in vim), which means navigating long item entries is much easier.
5. In LaTeX you have the \document{...} environment where you can't specify macros, so they need to be at the top; in Typst you can define macros close to where you need them.
6. It's easier to version control and diff, especially if you use semantic line breaks.
7. Changing page layout, margins, spacing between things, etc., footers with page counters, etc. just seems way easier to do.
You can define macros anywhere in a LaTeX document; it's packages that need to be loaded before \begin{document}.
> 6. It's easier to version control and diff, especially if you use semantic line breaks.
TeX mostly ignores whitespace, so semantic line breaks and version control should work equally well with both LaTeX and Typst.
(I agree with all your other points though)
https://tex.stackexchange.com/questions/7453/what-is-the-use...
I'm not a vim user, but my understanding is that it has native Unicode support. Software with an old-school UI but adapted to current needs (or where the needs just didn't change) is fine, but that's not the case with LaTeX.
This is the same reason why it isn't viable for me to switch to Typst either, by the way. I hope it gains popularity and ends up as a standard, displacing pdflatex (or standing alongside it).
"LaTeX is not a word processor! Instead, LaTeX encourages authors not to worry too much about the appearance of their documents but to concentrate on getting the right content."
IMO, the only people that use LaTeX are people who are willing to trade the convenience and productivity of using a sane document authoring format for the warm and fuzzy feeling you get when you use an outdated piece of typesetting software that is a) hard to configure, b) hard to use and c) produces output for the least useful reading platform available (paged pdfs).
And the pronunciation is stupid.
I hope you are aware that literally all research in mathematics and computer science is typed up and published in LaTeX?
(No shade on people who do decide to use alternatives, and Typst is great!)
The other place it's useful is heavily typeset documents, especially those subject to somewhat frequent modification, like a resume.
Using a plain-text format like Markdown, ReStructuredText or AsciiDoc is likely better in almost all cases.
Can Typst provide better register-true layout? That would be interesting to me.
That is common because they are following the rules about how to steer capitalisation when using bib(la)tex:
- If the entry is in English, and the style demands title case, output as is
- If the entry is in English, and the style demands sentence case, convert to sentence-case and output
- If the entry is not in English, output as is
Nope: it is not possible to automatically determine which capitalised nouns are proper (and thus remain capitalised in sentence case) and which are common (and thus become uncapitalised).
This is in fact why it is better to store sentence case: it can be unambiguously converted to title case while the reverse is ambiguous. It’s not mere preference.
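This is also why BibTeX-style databases let you protect proper nouns with braces: a style that converts to sentence case lowercases everything except the braced parts, so a stored sentence-case-convertible title remains unambiguous (the entry below is made up):

```bibtex
@article{smith2020,
  author  = {Smith, Jane},
  title   = {A Survey of {B}ayesian Methods for Hidden {M}arkov Models},
  journal = {Journal of Examples},
  year    = {2020}
}
```

A title-case style prints "Bayesian" and "Markov" either way; a sentence-case style lowercases "Survey", "Methods", "Hidden", and "Models" but leaves the braced letters alone.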
I have to agree that Typst source generally looks a lot less ugly than LaTeX. I considered writing stuff in Typst many times, but I couldn't muster the courage to do so.
It may be stupid and vain but for me if it doesn’t at least match the former it’s a no-go
Just try it out. It is free, open source and very easy to setup. Just install the extension Tinymist on VSCode, that is all you need.
As of this version, it would be very hard to tell the difference, in my experience.
But it seems like LaTeX is the kind of thing that LLMs would nail perfectly. I feel like using it today wouldn't be very bad.
But as soon as someone starts talking about LaTeX and how they spent months on their macros, I think “another hapless victim has fallen into LaTeX’s trap.” It’s like an ant lion that feeds on procrastinating students.
And when your life is revolving around classes or your thesis, the #1 most important thing to you in the world is how easily you can transfer your ideas to paper/digital format. It makes a lot of sense that people care a lot about the quality of their typesetting engine and exchange macro tips with each other (I got a lot of helpful advice from friends, and my default latex header was about 50% my own stuff and 50% copied from friends in my same major)
I bet he could have done something more advanced if he had modern computers, but looking at it 75 years later and seeing his handwriting on the page was moving more than the content itself.
It produces documents that look like those produced by professors, and luminaries in the field. If you write equations in Word Equation Editor, your work just doesn't look very serious.
It's the same joy I felt when I laser-printed my first newsletter designed in Aldus PageMaker. I was only in my teens but I felt like a "professional".
Haven't tried it in a while, but, last I checked, Word Equation Editor output didn't look serious because it looked janky, like it wasn't really done in a "professional" tool. Part of that is a self-fulfilling prophecy of course; LaTeX output looks right partly because it's what people have been reading for decades, but TeX's formulas just look plain good.
I would be willing to try again, but I'm not buying Word for the privilege.
She not only instantly recognized it, but, judging by the look and the platitudes she gave me on the spot, it probably earned me an extra point on the overall grade.
When in Rome...
The experience is also awful. It's much better to write \in or \frac{}{} rather than to go to a dropdown menu and figure out which button to click.
Sez you. MS Word 4.0 for Mac was perfectly alright, requiring less elbow grease than fiddling with LaTeX.
And you could get a PDF out of it, via the PostScript print driver.
Never liked those spindly CM Tex fonts, anyway.
It’s a dumb filter anyway.
I was there once. In hindsight all the tweaks were a complete waste of time. All I needed was amsart, plus beamer for slides.
Another ergonomic benefit is scripting. For example, if I'm running a series of scripts to generate figures/plots, LaTeX will pick up on the new files (if the filename is unmodified) and update those figures after recompiling. This is preferable to scrolling through a large document in MS Word and attempting to update each figure individually.
As the size and figure count of your document increases, the ergonomics in MS Word degrade. The initial setup effort in LaTeX becomes minimal as this cost is "amortized" over the document.
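As a concrete sketch of that workflow (the paths and script name are hypothetical): the plotting script overwrites the PDF in place, and the next compile picks it up without touching the document source:

```latex
\documentclass{article}
\usepackage{graphicx}
\begin{document}
% figures/loss.pdf is regenerated by e.g. `python make_plots.py`;
% recompiling updates every occurrence of the figure automatically.
\begin{figure}
  \centering
  \includegraphics[width=0.8\linewidth]{figures/loss}
  \caption{Training loss over epochs.}
\end{figure}
\end{document}
```

Omitting the extension in `\includegraphics` also lets the same source build with different figure formats.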
I'm still sour about the 3 days it took me to have something usable for my thesis, and I was starting from an existing template. And it's still not exactly how I want it to be; I gave up on addressing a bug in the reference list.
Meanwhile, when I had a decent setup I could move a whole section from the intro to the results and the overall layout didn't suffer (floating tables, figures and code still in place, references still pointing where they should). I had code snippets with colour highlights imported from the actual source code (good luck trying that in Word). I could insert the companion papers with a single line of code per document, and they looked great. I even had a compilation flag to output the ereader version.
My take was that Word enabled my team mate to kick a lot of cans down the road (but the cans eventually came back), while for me the reverse was true: build a decent foundation, and after that it was all pure write-cite-compile.
Obvious reasons:
- Your thesis is a major output of years of work. Of course you want it to look good.
- You might think it superficial, but if the presentation looks bad, many people (subconsciously) interpret this as a lack of care and attention. Just like an email with typos feels unprofessional even if the content is otherwise fine.
- Spending time on tooling feels productive even if it is not past a certain point.
- People that are into typesetting now have an excuse to spend time on it.
That said, in my experience people spent a few hours learning "enough" LaTeX years ago and almost never write any macros. The reason is simple: you work with other people and with different journal templates, so the less custom code, the better.
This is all to say, if you're working on a thesis or even a moderately large assignment, working in Word was not good for the nerves.
Looking back, I probably should have just worked in plain text and then worried about formatting only at the very end, but ummm, yes, I guess another hapless victim did indeed fall into LaTeX's trap. :)
The new versions at least serialise to some kind of monstrous XML representation of Word's internal state, so while it's not going to win any awards for world's most elegant document format, it should be slightly harder to corrupt in subtle ways.
Sure, in theory you can concentrate only on writing in Word and ignore layout. In practice it takes a lot of discipline, so instead you see people moving figures around, inserting spaces or returns to push a heading where they want it, etc. In particular as a way to procrastinate from actual writing.
In theory, yes. And that's also what I'm usually trying to do.
What I have observed with LaTeX folks, though, is that they type three words and then look at the preview or re-compile to see if it looks good.
I also basically read the rendered output in the right pane, but mostly to read back what I've written and judge whether it sounds good, not to mess with layout (especially since LaTeX and Typst handle that very well; I can be reasonably sure my paragraphs will have decent hyphenation and such).
Typst is interesting, but it doesn't yet support all microtypography features provided by microtype. IMHO, those make a big difference.
Large swathes of mathematics, computer science, and physics involve notations and diagrams that are genuinely hard to typeset, and incredibly repetitive and hard to read if you don’t make heavy use of the macro system. Integrating some actual programming features could be a game changer.
LuaTeX already lets you embed Lua code and it is really good.
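For anyone who hasn't seen it, a tiny self-contained example of the embedding (compile with lualatex; the computation is just an illustration):

```latex
\documentclass{article}
\begin{document}
% \directlua runs real Lua at typesetting time; tex.sprint feeds
% the result back into the input stream.
$\pi \approx \directlua{tex.sprint(math.pi)}$
\end{document}
```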
However, I do agree some usability improvements are needed.
Look at this thing: https://images.app.goo.gl/4WHN9Pqupxkk8Z3j7
And that’s before you get into stuff like categories of spans, etc.
It is what sets professional typography apart. Only Adobe InDesign provides a comparable implementation, tweaking all those details.
See https://en.wikipedia.org/wiki/Hz-program for a better explanation and an example.
IMHO, the difference is obvious and not minor. Without microtypography texts look ugly: https://upload.wikimedia.org/wikipedia/commons/0/03/Hz_Progr...
Which is to say, half of these things are pretty subjective.
TeXmacs claims to have implemented microtypography as well (https://www.texmacs.org/tmweb/home/news.en.html, as I am reading it, in the opening paragraph on version 2.1)
I'd also not overemphasize the significance of microtype features. They might help with narrow columns, but on wider columns the difference is very small and most people will never notice them at all.
Honestly I don't disagree with him, it looked far better in 'TeX. But that's probably a learnt preference.
In essence, it's culture.
I’ve found in the decades since then that my most productive co-authors have been the ones who don’t think about typesetting and just use the basics. The ones who obsess over things like tikz or fancy macros for things like source layout and such: they get annoying fast.
I mean it is one of the few packages that can actually manage to annoy LaTeX fans, which is really saying something.
That said, nobody makes you use TikZ, fire up Inkscape and do it wysiwyg.
Probably because Donald Knuth created TeX and Leslie Lamport created LaTeX.
Two of the greatest minds in Computer Science created the tools and used them to write papers and articles that are beautiful.
Elegant ideas presented beautifully make reading and writing papers a nicer experience.
Autocorrect incorrected it for me.
I am saving this entire sequence for later use.
In my master's there were like 30 pages of formulas, all interdependent. Typing/retyping these would take forever.
Also, something as simple as having per-chapter files or working with an acceptable editor also helps.
But once you are in the latex world you start noticing how much prettier things can be. And then you end up sinking another thousand hours to perfectly aligning the summations in your multi-line equations.
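The alignment machinery itself is standard amsmath, for the record; rows line up on the & markers in the align environment. A minimal example:

```latex
\usepackage{amsmath}
% Rows align on the & marker, here placed before the = signs.
\begin{align}
  S_n &= \sum_{k=1}^{n} k \\
      &= \frac{n(n+1)}{2}
\end{align}
```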
LaTeX's handling of floating figures and tables is also much better.
And of course math notation is much nicer to work with in LaTeX (IMO).
You can actually use LaTeX math notation in the equation editor in modern Word.
Same reason wantrepreneurs have a fascination with adding dark mode to their CSS. It feels productive while you avoid the real work.
Accessibility is just as important as “the real work”.
Usually the process for ordering books is that you send them a PDF with embedded fonts inside it, and it's made at the university's printing house. They will handle distribution etc. So you really, really want it to look right at the first go.
There's been some progress in the past few years, so you now get to preview the book somewhat, but one surefire way to get it right is to use something like LaTeX: the PDF you preview is exactly what comes off the press. And it used to be really hard to do certain required things in e.g. Word, for instance suppressing page numbers on some pages and using roman numerals on others.
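The roman-numeral part, for instance, is built in; a minimal sketch:

```latex
\pagenumbering{roman}    % front matter: i, ii, iii, ...
\thispagestyle{empty}    % suppress the number on the title page
% ... abstract, table of contents ...
\cleardoublepage
\pagenumbering{arabic}   % main matter: counter resets to 1
```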
Some other comments are oriented around aesthetics ("taste") or the state of other tools (Word, etc.) which I understand but those issues are more personal.
I am biased however, as my thesis was written in LaTeX with all the plots regenerated at compile time from the raw data.
Because its target userbase is people who don’t give a single shit about typography.
Why does anyone care about typesetting? Probably because they spend a lot of time working with text and have therefore developed a level of taste.
Just because the bottom 80% of consumers have zero taste and will accept any slop you give them doesn't mean there isn't value in doing something only appreciated by the top 20%. In any field, not just typesetting. Most people have ~no refined endogenous preferences for food, art, music, etc.
A mountain hiker can wear whatever, but above a certain altitude something must be true of them (fit, trained well, holding various gear, has supplies, or is in a plane/heli and probably even better trained/equipped/fit).
I would hope that typesetting is just a qualia of an ordered mind not a goal of it.
You can choose to feel "humiliated", but the truth is probably closer to this: you may simply be inadequate in that regard.
I.e. it is not that using LaTeX (or even Typst) makes you a better person, just that certain types of people will tend to use tools, like mountain climbers likely use carabiners.
At least 1 [0], but that's obviously a rather special case.
Then you discover that it is beautiful. Honestly, even using the base style sets you above the typesetting of most books. With some extra tweaks, it is beautiful.
Did I spend a lot of time on LaTeX during my PhD? Sure! But (even counting all the masochism involved in dealing with LaTeX) I cherish both that time and the results.
And publishing is the primary way academics communicate at large, so it's kind of important to be able to write your specific notation without resorting to drawing on paper.
E.g., guitarists who own 80 of the same guitar and spend many hours on the internet arguing about tiny variations in what Fender was doing in 1961. And then they put out a video and it turns out they can barely play guitar at all.
I wouldn't exactly criticize them for that choice, but it's definitely a choice. Or maybe not a conscious choice, because the road to improvement is hard, but the road to more gear is loaded with honeypots of dopamine.
Latex will be around for decades.
I suppose the issue is not new; many people didn't want to use new languages before because they couldn't copy snippets from the internet, but it was frustrating then too.
I’ve been coding since before the camel book was published: at that time it was basically ask Larry Wall on Usenet or a local bearded guru if you weren’t in a university setting and wanted to learn to code.
I can hand craft code in many a language; I can also do fine wooden joinery. When a project has value to me in the completion and hours to completion is my metric then a cnc machine or an llm is a great tool, and allows me to make things that aren’t “worth” hand coding.
When I want to work on a technical skill or just get in the flow I code by hand or use my wood tools. Upshot: different strokes for different folks.
I'm surprised to hear that—I've been using GitHub Copilot with ConTeXt [0] since 2021, and it mostly works fairly well. And ConTeXt is much more obscure than Typst (but also much older, so maybe that gives it an advantage?).
I want a flowed book layout (so we have a facing page with inner and outer margins.)
I am rendering chats in the main part of the page. Chats alternate left and right alignment so it looks a bit like a text conversation. For each chat I want to put metadata (reactions, sender, time) on the margin it is aligned to.
So for a left chat on a left page, I want to use the left (outside) margin; for a left chat on a right-hand page, the inside margin.
Two things I could not get sorted: first, perfect vertical alignment between the chat and metadata (I think this is possible but difficult), and second, a persnickety bug where the first chat on each page picks up the previous page's margin side.
Happy to pay for an answer - I did try to hire a typesetter for this as well.
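Not a full answer, but the first-chat-on-each-page bug sounds like the classic timing problem with \ifodd\value{page}: the test runs while TeX is still reading the paragraph, before it knows which page the material lands on. The changepage package's \checkoddpage checks the page actually used. A rough sketch for the left-aligned chats, assuming the marginnote and changepage packages (\leftchatmeta is a made-up name):

```latex
\usepackage{marginnote}
\usepackage[strict]{changepage}

% Metadata for a left-aligned chat: always in the physical left margin.
\newcommand{\leftchatmeta}[1]{%
  \checkoddpage
  \ifoddpage
    \reversemarginpar   % right-hand page: the left margin is the inner one
  \else
    \normalmarginpar    % left-hand page: the left margin is the outer one
  \fi
  \marginnote{\footnotesize #1}%
}
```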
They are very decent at inferring the context of stuff and will mark up code, maths, titles and so on fairly decently. This lets you focus on the work of making it look nice.
Markdown + pandoc is not the same: you already need basic formatting. You have to think about the formatting. For example, math formulas and code are gnarly to type, so I have snippets for them to avoid getting out of the flow.
But LLMs don't need that: you can just dump your thoughts, and they format them.
I'm observing, not here to convince anyone. The last six months of my life have been turned upside down, trying to discover the right touch for working with AI on topological research and code. It's hard to find good advice. Like surfing, the hardest part is all these people on the beach whining how the waves are kind of rough.
AI can actually read SVG math diagrams better than most people. AI doesn't like reading LaTeX source any more than I do.
I get the journal argument, but really? Some thawed-out-of-a-glacier journal editors still insist on two column formats, as if anyone still prints to paper. I'm old enough to not care. I'm thinking of publishing my work as a silent animation, and only later reluctantly releasing my AI prompts in the form of Typst documentation for the code.
That's AI which must adapt, not humans. If AI can't adapt then it can't be considered intelligent.
> Some thawed-out-of-a-glacier journal editors still insist on two column formats, as if anyone still prints to paper.
Narrow text is easier to read because your eyes don't have to travel kilometres. I purposely shrink the width of the browser when reading longer texts.
I've been wondering about this a lot lately, specifically if there's a way to optimise my writing for AI to return specific elements when it's scraped/summarised (or whatever).