A comprehensive listing of internet search engines out there.
Each element can be styled with multiple box shadows. By programmatically controlling the position and size of each individual box shadow, it's possible to build something like a pixel display.
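As a rough sketch of the idea (my own illustration, not the linked demo's code; the function and parameter names are made up): turn a 1-bit pixel grid into a single box-shadow value, one shadow per lit pixel.
// Build a CSS `box-shadow` value that "draws" a bitmap: one shadow per lit
// pixel, each offset by the pixel's grid position times the pixel size.
fn pixel_display(grid: &[&[u8]], size: u32, color: &str) -> String {
    let mut shadows = Vec::new();
    for (y, row) in grid.iter().enumerate() {
        for (x, &cell) in row.iter().enumerate() {
            if cell != 0 {
                // Each shadow is "<x-offset> <y-offset> <blur> <spread> <color>".
                shadows.push(format!("{}px {}px 0 0 {}", x as u32 * size, y as u32 * size, color));
            }
        }
    }
    format!("box-shadow: {};", shadows.join(", "))
}

fn main() {
    // A 3x3 "plus" glyph, rendered as 10px pixels.
    let grid: &[&[u8]] = &[&[0, 1, 0], &[1, 1, 1], &[0, 1, 0]];
    println!("{}", pixel_display(grid, 10, "#000"));
}
The resulting string is applied to a single square element of width and height size, which the shadows then replicate across the grid.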
Given a linked list that happens to sit in consecutive memory, traversing it can take advantage of the L1 cache. It's possible to squeeze out more performance by hinting the branch predictor to allow speculative execution, resulting in better parallelism from CPU pipelining. A simple and interesting trick, although I can't think of many practical uses outside of specific scenarios.
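A minimal sketch of the layout half of this (my own illustration, not the article's code): nodes packed back-to-back in a Vec so the traversal walks memory linearly. The branch-predictor hint itself is compiler-specific (__builtin_expect in C, the unstable likely intrinsic in Rust), so it's only marked in a comment.
// Nodes live contiguously in a Vec and link by index, so following `next`
// tends to hit cache lines that are already in L1.
struct Node {
    value: u64,
    next: Option<usize>, // index into the same Vec instead of a heap pointer
}

fn sum(nodes: &[Node], head: usize) -> u64 {
    let mut total = 0;
    let mut cur = Some(head);
    // This is where a branch hint would go: telling the compiler that `Some`
    // is the overwhelmingly likely case keeps the CPU speculating down the
    // "keep traversing" path.
    while let Some(i) = cur {
        total += nodes[i].value;
        cur = nodes[i].next;
    }
    total
}

fn main() {
    // A 4-node list laid out in traversal order: 0 -> 1 -> 2 -> 3.
    let nodes: Vec<Node> = (0..4u64)
        .map(|i| Node { value: i, next: if i < 3 { Some(i as usize + 1) } else { None } })
        .collect();
    println!("{}", sum(&nodes, 0)); // 6
}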
TIL in sh, you can press C-w to delete the previous word and C-u to delete the whole line.
An interactive introduction to the four-color problem and zero-knowledge proofs.
A refreshing viewpoint: when we use UDP as an unreliable protocol, what we often actually want is its "timeliness" property, which is to say, if we have to drop one of two versions of the same data, we want to drop the old one. This is why real-time video streaming and gaming choose UDP. The datagram extension in QUIC offers a nice solution: data is split into streams, and within each stream data is ordered. Each stream has an attached priority that is used to decide which packets to drop.
However, how do you choose what to drop without a bloated buffer that hurts latency? The author suggests using delay-based congestion control like BBR, which uses network metrics to probe the bandwidth and RTT.
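As a toy model of that trade-off (my own sketch, not the QUIC datagram API; the type and field names are made up): a send buffer where a newer version simply replaces the older one within its stream, and the lowest-priority stream is dropped first when the buffer is over budget.
use std::collections::HashMap;

// Toy "timeliness" buffer: per stream, the newest payload replaces the stale
// one instead of queueing behind it; when over budget, the lowest-priority
// stream's pending datagram is dropped first.
struct SendBuffer {
    max_entries: usize,
    pending: HashMap<u64, (u8, Vec<u8>)>, // stream id -> (priority, latest payload)
}

impl SendBuffer {
    fn push(&mut self, stream: u64, priority: u8, payload: Vec<u8>) {
        // Newest version wins within a stream: the old payload is dropped here.
        self.pending.insert(stream, (priority, payload));
        while self.pending.len() > self.max_entries {
            // Evict the pending datagram with the lowest priority.
            let victim = self
                .pending
                .iter()
                .min_by_key(|(_, (prio, _))| *prio)
                .map(|(&stream, _)| stream);
            match victim {
                Some(stream) => { self.pending.remove(&stream); }
                None => break,
            }
        }
    }
}

fn main() {
    let mut buf = SendBuffer { max_entries: 2, pending: HashMap::new() };
    buf.push(1, 10, b"video frame v1".to_vec());
    buf.push(1, 10, b"video frame v2".to_vec()); // replaces v1, which is never sent
    buf.push(2, 1, b"telemetry".to_vec());
    buf.push(3, 5, b"game input".to_vec()); // buffer over budget: telemetry is dropped
    println!("{} streams pending", buf.pending.len()); // 2
}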
Analyzes various information about a website from a URL: IP, WHOIS, TLS, cookies, etc. Similar to a VirusTotal scan.
The design of HTTP/3 by Daniel Stenberg.
The design of HTTP/2 by Daniel Stenberg.
I stole this technique from Lawrence Block's outstanding Telling Lies for Fun and Profit, a book about writing fiction. He suggests drafting a story the "natural" way, with the first chapter introducing the hero and the second getting the action going, then swapping the two chapters.
Brilliant!
Farside collects categories of alternative libre front-end proxies to popular services. It tests their availability automatically and only shows the working ones.
Vulkan learning guide.
Detailed scan report for a URL: domain/IP info, HTTP transactions, links, JavaScript behavior analysis, etc.
VirusTotal has this public tool that shows detailed information about an IP address/domain: historical WHOIS lookups and certificate logs. Similar to crt.sh.
I always confuse the concepts of ours and theirs. Every time I have to choose one, I need to search for which is which. Turns out I'm not alone! This website explains what each term refers to. TL;DR: the currently checked-out branch is "ours" in a merge, but "theirs" in a rebase.
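For example, when a conflicted file comes up you can take one side wholesale with git checkout --ours <path> or git checkout --theirs <path>; during a rebase, --ours picks the branch you're rebasing onto, and --theirs picks the commits being replayed.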
TIL the faintness that comes from hyperventilation is a result of respiratory alkalosis, which is caused by a reduced level of CO2 (acidic) in the blood. CO2 is in equilibrium with HCO3- (basic) in the blood (CO2 + H2O ⇌ H2CO3 ⇌ H+ + HCO3-), acting as a pH buffer, so blowing off CO2 raises blood pH. The lungs control the amount of CO2 whereas the kidneys control the amount of HCO3-, so the breathing rate is determined by the level of CO2. Although exhaled CO2 accounts for the majority of weight loss in humans, hyperventilation is not a feasible way to lose weight because the benefit is marginal compared to the downside of respiratory alkalosis. A better way to lose weight is to produce more CO2, which is achieved by increasing the metabolic rate.
Pratt parsing is a parsing algorithm that solves the awkward handling of left recursion in a recursive descent parser. It elegantly handles both operator precedence and associativity using a single "binding power" concept. The core of the algorithm is as follows:
// `S` (the output S-expression), `Token`, and `Lexer` are defined in the linked post.
fn expr(input: &str) -> S {
    let mut lexer = Lexer::new(input);
    expr_bp(&mut lexer, 0)
}

fn expr_bp(lexer: &mut Lexer, min_bp: u8) -> S {
    let mut lhs = match lexer.next() {
        Token::Atom(it) => S::Atom(it),
        t => panic!("bad token: {:?}", t),
    };

    loop {
        let op = match lexer.peek() {
            Token::Eof => break,
            Token::Op(op) => op,
            t => panic!("bad token: {:?}", t),
        };

        // An operator only captures the current lhs if its left binding power
        // clears the minimum set by the caller.
        let (l_bp, r_bp) = infix_binding_power(op);
        if l_bp < min_bp {
            break;
        }

        lexer.next();
        let rhs = expr_bp(lexer, r_bp);
        lhs = S::Cons(op, vec![lhs, rhs]);
    }

    lhs
}

fn infix_binding_power(op: char) -> (u8, u8) {
    match op {
        '+' | '-' => (1, 2),
        '*' | '/' => (3, 4),
        _ => panic!("bad op: {:?}", op),
    }
}
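Note how the binding-power pairs encode everything: + and - get (1, 2) while * and / get (3, 4), so higher pairs bind tighter (precedence), and making each right power one more than the left makes the operators left-associative, e.g. 1 - 2 - 3 parses as (- (- 1 2) 3).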
Like the now-shut-down RawGit, it's a free CDN for serving raw files from GitHub and other platforms.
curl ifconfig.io
curl ifconfig.me
curl ifconfig.co
curl ip.sb
curl icanhazip.com
curl myip.wtf/text
curl geofind.me # has geolocation info
port reachability test:
ifconfig.co/port/22
Method: train a sparse autoencoder on the activations of the residual stream. The sparsity ensures that only a few features activate for similar activation patterns in the residual stream. Each feature is in turn interpreted by an LLM for its semantics. One can then use these features to semantically interpret the workings of the model and steer it towards desired goals.
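A bare-bones sketch of the architecture (my own illustration, not the paper's code; all names and dimensions here are made up): the encoder/decoder forward pass and the usual reconstruction-plus-L1 loss, with training, weight normalization tricks, and the LLM-based labeling all omitted.
// Sparse autoencoder over a residual-stream activation vector x:
//   f = ReLU(W_enc x + b_enc)   (feature activations, mostly zero)
//   x_hat = W_dec f + b_dec     (reconstruction)
//   loss = ||x - x_hat||^2 + l1_coeff * ||f||_1
struct SparseAutoencoder {
    w_enc: Vec<Vec<f32>>, // [n_features][d_model]
    b_enc: Vec<f32>,      // [n_features]
    w_dec: Vec<Vec<f32>>, // [d_model][n_features]
    b_dec: Vec<f32>,      // [d_model]
    l1_coeff: f32,
}

fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

impl SparseAutoencoder {
    // ReLU plus the L1 penalty below is what keeps most features at zero.
    fn encode(&self, x: &[f32]) -> Vec<f32> {
        self.w_enc.iter().zip(&self.b_enc).map(|(row, b)| (dot(row, x) + b).max(0.0)).collect()
    }

    fn decode(&self, f: &[f32]) -> Vec<f32> {
        self.w_dec.iter().zip(&self.b_dec).map(|(row, b)| dot(row, f) + b).collect()
    }

    fn loss(&self, x: &[f32]) -> f32 {
        let f = self.encode(x);
        let x_hat = self.decode(&f);
        let recon: f32 = x.iter().zip(&x_hat).map(|(a, b)| (a - b).powi(2)).sum();
        let sparsity: f32 = f.iter().map(|v| v.abs()).sum();
        recon + self.l1_coeff * sparsity
    }
}

fn main() {
    // Toy dimensions: a 4-dim "residual stream" expanded into 8 features.
    let sae = SparseAutoencoder {
        w_enc: vec![vec![0.1; 4]; 8],
        b_enc: vec![0.0; 8],
        w_dec: vec![vec![0.1; 8]; 4],
        b_dec: vec![0.0; 4],
        l1_coeff: 1e-3,
    };
    let x = vec![0.5, -1.0, 0.25, 2.0];
    println!("loss = {}", sae.loss(&x));
}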