<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Notes · Pablo Stafforini</title><link>https://stafforini.com/notes/</link><description/><generator>Hugo -- gohugo.io</generator><language>en</language><lastBuildDate>Wed, 15 Apr 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://stafforini.com/notes/index.xml" rel="self" type="application/rss+xml"/><item><title>Situational Awareness LP</title><link>https://stafforini.com/notes/situational-awareness-lp/</link><pubDate>Fri, 27 Feb 2026 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/situational-awareness-lp/</guid><description>&lt;![CDATA[**Situational Awareness LP** is a hedge fund founded in 2024 by Leopold Aschenbrenner and co-managed by Carl Shulman. It is backed by Patrick &amp; John Collison, Daniel Gross, and Nat Friedman. The fund's thesis is explicitly AGI-focused, pursuing an opportunistic approach to public equities and strategic investments in semiconductor companies and energy infrastructure.
## The copycat approach {#the-copycat-approach}
Investing directly in the fund requires a $25M minimum, a two-year lockup followed by quarterly redemptions over two more years, and Qualified Purchaser accreditation. The fund also has the standard "2 and 20" fee structure. If those barriers are too much for you, there is an alternative. Institutional investment managers with over $100M in qualifying assets must file Form 13F with the United States Securities and Exchange Commission (SEC) each quarter, disclosing their long equity positions, options, and convertible bonds. Filings are due 45 days after quarter-end and are published on the SEC's [EDGAR system](https://www.sec.gov/cgi-bin/browse-edgar?action=getcompany&CIK=0002045724&type=13F&dateb=&owner=include&count=40).
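EDGAR serves each filer's history as JSON at `https://data.sec.gov/submissions/CIK##########.json` (ten-digit, zero-padded CIK), with the recent filings exposed as parallel lists. As a minimal sketch of the lookup the full script below automates (the sample dict here is illustrative, not real filing data):

```python
def thirteen_f_dates(submissions):
    """Return filing dates of 13F-HR filings from an EDGAR submissions dict.

    `submissions` has the shape served by data.sec.gov: the 'recent' entry
    holds parallel lists, one element per filing.
    """
    recent = submissions['filings']['recent']
    return [date for form, date in zip(recent['form'], recent['filingDate'])
            if form == '13F-HR']

# Offline example mirroring the endpoint's shape; amendments (13F-HR/A)
# are skipped:
sample = {'filings': {'recent': {
    'form': ['13F-HR', '13F-HR/A', '13F-HR'],
    'filingDate': ['2026-02-17', '2025-12-01', '2025-11-14'],
}}}
print(thirteen_f_dates(sample))  # ['2026-02-17', '2025-11-14']
```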
This approach has significant limitations, however: the copycat sees each portfolio only after it is disclosed, not when the fund actually trades into it;[^fn:1] 13F filings exclude short positions, foreign-listed securities, and non-equity assets; and, for options, filings identify the underlying security but omit strike prices, expirations, and premiums. Still, the loss of fidelity may be acceptable if one is sufficiently bullish on the fund's strategy and wants a simple way to get exposure to its public equity bets.
## Backtesting the strategy {#backtesting-the-strategy}
A trader following this strategy would rebalance to each new 13F on its filing date — buying the disclosed portfolio at that day’s closing prices, weighting each position by its reported dollar value — and hold unchanged until the next filing. Backtesting the rule is then mostly mechanical: replay the rebalances across all available filings and compound the period returns.
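As a toy illustration of that rule (not the script's actual accounting, which also handles option sizing), each period's return is a value-weighted average of position returns, and the periods then chain multiplicatively:

```python
def portfolio_return(values, returns):
    """Value-weighted period return: each position is weighted by its
    reported 13F dollar value."""
    total = sum(values)
    return sum(v * r for v, r in zip(values, returns)) / total

def compound(period_returns):
    """Chain per-period simple returns into one cumulative return."""
    total = 1.0
    for r in period_returns:
        total *= 1 + r
    return total - 1

# Two positions worth $2M and $1M returning +10% and +40% in one period:
print(round(portfolio_return([2e6, 1e6], [0.10, 0.40]), 4))  # 0.2
# Three filing periods returning +10%, -5%, +20% compound to:
print(round(compound([0.10, -0.05, 0.20]), 4))  # 0.254
```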
One wrinkle complicates the replay. 13F filings report options only partially: they name the underlying security and give a dollar value, but omit strike, expiration, and premium. Since that is not enough information to reconstruct the actual option positions, the script below reports the backtest under two modes, labelled _equity proxy_ and _option proxy_. Both are explained in detail after the script. The final period runs from the most recent filing to today, so re-running the script updates the last row automatically.[^fn:2]<details><summary>Code</summary><div class="details"><a id="code-snippet--sa-data"/>
```python
# ── SA LP 13F data fetcher ─────────────────────────────────────────
# Fetches all 13F-HR filings from SEC EDGAR for Situational Awareness LP,
# parses the infotable XML, and resolves CUSIPs to tickers.
# Output: JSON with one entry per quarterly filing, each containing
# the filing date, quarter label, and a list of holdings.
# Caches results in CACHE_DIR to avoid redundant SEC requests.
import urllib.request, re, json, sys, time, os, xml.etree.ElementTree as ET
from datetime import datetime

# ── Configuration (update these for your environment) ──────────────
SEC_UA = os.environ.get(
    'SEC_USER_AGENT',
    'stafforini.com situational-awareness-lp research; contact via stafforini.com')
CACHE_DIR = os.path.expanduser('~/.cache')
CIK = '2045724'
BASE = f'https://www.sec.gov/Archives/edgar/data/{CIK}'
NS = {'ns': 'http://www.sec.gov/edgar/document/thirteenf/informationtable'}
CACHE = os.path.join(CACHE_DIR, 'sa-lp-13f.json')
CUSIP_TICKER = {
    '038169207': 'APLD', '05614L209': 'BW', '09173B107': 'BITF',
    '093712107': 'BE', '093712AH0': 'BE', '11135F101': 'AVGO',
    '12514G108': 'CIFR', '17253J106': 'CIFR', '17253JAA4': 'CIFR',
    '18452B209': 'CLSK', '19247G107': 'COHR', '21037T109': 'CEG',
    '21873S108': 'CRWV', '21874A106': 'CORZ', '26884L109': 'EQT',
    '36168Q104': 'GLXY', '36317J209': 'GLXY', '44282L109': 'HUT',
    '44812J104': 'HUT', '456788108': 'INFY', '458140100': 'INTC',
    '49338L103': 'KRC', '49427F108': 'KRC', '53115L104': 'LBRT',
    '55024U109': 'LITE', '55024UAD1': 'LITE', '573874104': 'MRVL',
    '577933104': 'MRVL', '593787101': 'MU', '593787105': 'MU',
    '595112103': 'MU', '607828100': 'MOD', '67066G104': 'NVDA',
    '683344105': 'ONTO', '68340J108': 'ONTO', '73933G202': 'PSIX',
    '73933H100': 'PSIX', '743344109': 'PUMP', '74347M108': 'PUMP',
    '76754A103': 'RIOT', '767292105': 'RIOT', '80004C200': 'SNDK',
    '80106M109': 'SNDK', '83418M103': 'SEI', '87422Q109': 'TLN',
    '87425V106': 'TLN', '874039100': 'TSM', '89854H102': 'TSEM',
    '92189F106': 'SMH', '92189F676': 'SMH', '92535P101': 'VRT',
    '92537N108': 'VRT', '92840M102': 'VST', '958102105': 'WDC',
    '958102AT2': 'WDC', '98321C108': 'WYFI',
    'G1110V104': 'BITF', 'G1189L107': 'BTDR', 'G11448100': 'BTDR',
    'G7945J104': 'STX', 'G7997R103': 'STX', 'G96115103': 'WYFI',
    'M87915274': 'TSEM', 'Q4982L109': 'IREN',
}

def fetch(url, timeout=10):
    time.sleep(0.5)
    req = urllib.request.Request(url, headers={'User-Agent': SEC_UA})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.read()

def find_infotable_filename(acc):
    """Discover the infotable XML filename for a filing via EFTS, then -index.htm."""
    # EFTS search (fast, reliable)
    try:
        efts = f'https://efts.sec.gov/LATEST/search-index?q=%22{acc}%22'
        data = json.loads(fetch(efts))
        for hit in data.get('hits', {}).get('hits', []):
            doc_id = hit['_id']  # format: "accession:filename"
            filename = doc_id.split(':', 1)[1] if ':' in doc_id else ''
            if filename.endswith('.xml') and 'primary_doc' not in filename:
                return filename
    except Exception:
        pass
    # Fallback: filing index page
    acc_path = acc.replace('-', '')
    try:
        html = fetch(f'{BASE}/{acc_path}/{acc}-index.htm').decode()
        for href in re.findall(r'href="([^"]*\.xml)"', html):
            fn = href.split('/')[-1]
            if fn != 'primary_doc.xml' and 'xslForm' not in href:
                return fn
    except Exception:
        pass
    return None

def parse_infotable(xml_data):
    root = ET.fromstring(xml_data)
    holdings = []
    for info in root.findall('.//ns:infoTable', NS):
        cusip = info.findtext('ns:cusip', '', NS).strip()
        value = int(info.findtext('ns:value', '0', NS))
        putcall = info.findtext('ns:putCall', '', NS).strip().lower()
        ticker = CUSIP_TICKER.get(cusip, '')
        if not ticker:
            issuer = info.findtext('ns:nameOfIssuer', '', NS)
            print(f"WARNING: Unknown CUSIP {cusip} ({issuer})", file=sys.stderr)
            ticker = f"UNKNOWN_{cusip}"
        pos_type = 'put' if putcall == 'put' else 'call' if putcall == 'call' else 'long'
        holdings.append({"ticker": ticker, "type": pos_type, "value": value})
    return holdings
def quarter_from_filing_date(fdate):
    d = datetime.strptime(fdate, '%Y-%m-%d')
    m, y = d.month, d.year
    if m <= 3:
        return f'Q4_{y-1}', f'{y-1}-12-31'
    elif m <= 6:
        return f'Q1_{y}', f'{y}-03-31'
    elif m <= 9:
        return f'Q2_{y}', f'{y}-06-30'
    else:
        return f'Q3_{y}', f'{y}-09-30'

def load_cache():
    if os.path.exists(CACHE):
        with open(CACHE) as f:
            return json.load(f)
    return {"filings": []}

def save_cache(data):
    os.makedirs(os.path.dirname(CACHE), exist_ok=True)
    with open(CACHE, 'w') as f:
        json.dump(data, f)

cached = load_cache()
cached_quarters = {f["quarter"] for f in cached["filings"]}
result = {"filings": list(cached["filings"])}
try:
    subs_url = f'https://data.sec.gov/submissions/CIK{CIK.zfill(10)}.json'
    subs = json.loads(fetch(subs_url))
    recent = subs['filings']['recent']
    accessions = sorted(
        [(recent['filingDate'][i], recent['accessionNumber'][i])
         for i in range(len(recent['form']))
         if recent['form'][i] == '13F-HR'])
    for fdate, acc in accessions:
        quarter, quarter_end = quarter_from_filing_date(fdate)
        if quarter in cached_quarters:
            continue
        filename = find_infotable_filename(acc)
        if not filename:
            print(f"Could not find infotable for {acc}", file=sys.stderr)
            continue
        acc_path = acc.replace('-', '')
        xml_data = fetch(f'{BASE}/{acc_path}/{filename}')
        holdings = parse_infotable(xml_data)
        result["filings"].append({
            "quarter": quarter,
            "quarter_end": quarter_end,
            "filing_date": fdate,
            "holdings": holdings})
    result["filings"].sort(key=lambda f: f["filing_date"])
    save_cache(result)
except Exception as e:
    if result["filings"]:
        print(f"SEC fetch error ({e}); using cache", file=sys.stderr)
    else:
        raise

return json.dumps(result)
```

<a id="code-snippet--sa-perf"/>
```python
import json
import yfinance as yf
import pandas as pd
from datetime import datetime, timedelta
import numpy as np
import requests
import time
import os
import warnings
warnings.filterwarnings('ignore')
# Parse data from the scraper block
parsed = json.loads(data) if isinstance(data, str) else data
filings = parsed["filings"]

# Build internal structures
filing_dates = {f["quarter"]: f["filing_date"] for f in filings}
quarter_end_dates = {f["quarter"]: f["quarter_end"] for f in filings}
quarters = [f["quarter"] for f in filings]

# Convert holdings list to dict keyed by quarter.
# Multiple positions in the same ticker with different types are aggregated
# by value per (ticker, type) pair.
holdings = {}
for f in filings:
    positions = {}
    for h in f["holdings"]:
        ticker = h["ticker"]
        pos_type = h["type"]
        value = h["value"]
        key = (ticker, pos_type)
        positions[key] = positions.get(key, 0) + value
    holdings[f["quarter"]] = positions
def _extract_close_series(df, ticker):
    """Extract a single close-price series from a yfinance result."""
    if df.empty:
        return pd.Series(dtype=float)
    if isinstance(df.columns, pd.MultiIndex):
        if 'Close' not in df.columns.get_level_values(0):
            return pd.Series(dtype=float)
        close = df['Close']
        if isinstance(close, pd.DataFrame):
            if ticker in close.columns:
                series = close[ticker]
            elif len(close.columns) == 1:
                series = close.iloc[:, 0]
            else:
                return pd.Series(dtype=float)
        else:
            series = close
    elif 'Close' in df.columns:
        series = df['Close']
        if isinstance(series, pd.DataFrame):
            series = series.iloc[:, 0]
    else:
        return pd.Series(dtype=float)
    return pd.to_numeric(series, errors='coerce').dropna()

def _download_close_series(ticker, start, end):
    """Download one ticker's close series; used to repair flaky batch misses."""
    df = yf.download(ticker, start=start, end=end, progress=False,
                     auto_adjust=True)
    return _extract_close_series(df, ticker)
def get_prices(tickers, dates):
    """Fetch close prices for tickers on specific dates."""
    unique_tickers = sorted(set(tickers))
    all_dates = [datetime.strptime(d, '%Y-%m-%d') for d in dates]
    start = min(all_dates) - timedelta(days=5)
    end = max(all_dates) + timedelta(days=5)
    df = yf.download(unique_tickers, start=start, end=end, progress=False,
                     auto_adjust=True)
    # yf.download returns MultiIndex columns (metric, ticker) for multiple tickers
    if df.empty:
        close = pd.DataFrame()
    elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = unique_tickers
    else:
        close = pd.DataFrame()
    prices = {}
    for ticker in unique_tickers:
        if ticker in close.columns:
            series = pd.to_numeric(close[ticker], errors='coerce').dropna()
        else:
            series = pd.Series(dtype=float)
        if series.empty:
            series = _download_close_series(ticker, start, end)
        if series.empty:
            continue
        prices[ticker] = {}
        for date_str in dates:
            target = pd.Timestamp(datetime.strptime(date_str, '%Y-%m-%d'))
            after = series[series.index >= target]
            if not after.empty:
                prices[ticker][date_str] = float(after.iloc[0])
            else:
                before = series[series.index <= target]
                if not before.empty:
                    prices[ticker][date_str] = float(before.iloc[-1])
    return prices

def _price_on_or_after(px_by_date, target_date):
    """Return (date, price) for the first available price on/after target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d >= target_date)
    if not dates:
        return None
    d = dates[0]
    return d, px_by_date[d]

def _price_on_or_before(px_by_date, target_date):
    """Return (date, price) for the last available price on/before target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d <= target_date)
    if not dates:
        return None
    d = dates[-1]
    return d, px_by_date[d]

def _period_price_pair(px_by_date, start_date, end_date):
    """Return start/end prices for a period using sensible boundary alignment."""
    start = _price_on_or_after(px_by_date, start_date)
    end = _price_on_or_before(px_by_date, end_date)
    if start is None or end is None:
        return None
    start_actual, p0 = start
    end_actual, p1 = end
    if end_actual < start_actual:
        return None
    return start_actual, end_actual, p0, p1

def _option_position_key(ticker, pos_type):
    return (ticker, pos_type)

def _linear_underlying_sign(pos_type):
    """Direction when option rows are converted to underlying equity exposure."""
    return -1 if pos_type == 'put' else 1

def compute_return(positions, prices, start_date, end_date,
                   mode='equity_only', option_prices=None):
    """Compute portfolio return between two dates.

    The 13F value for an option row is treated as underlying notional, not
    option premium. Option contracts are sized from that notional, but the
    portfolio denominator is estimated deployed capital: stock value plus
    option premium cost. This avoids treating the gap between option
    notional and option premium as cash.

    In 'full' mode, every option row requires a MarketData price series;
    missing data raises rather than falling back.
    """
    total_cost = 0
    portfolio_pnl = 0
    for (ticker, pos_type), value in positions.items():
        is_option = pos_type in ('call', 'put')
        stock_px = prices.get(ticker)
        if mode == 'equity_only':
            if pos_type not in ('long', 'call', 'put'):
                continue
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            total_cost += value
            portfolio_pnl += value * _linear_underlying_sign(pos_type) * stock_ret
            continue
        if is_option:
            opt_key = _option_position_key(ticker, pos_type)
            opt_px = option_prices.get(opt_key) if option_prices else None
            if not opt_px:
                raise RuntimeError(
                    f"No MarketData option prices for {opt_key} in period "
                    f"{start_date}..{end_date}")
            pair = _period_price_pair(opt_px, start_date, end_date)
            if pair is None:
                raise RuntimeError(
                    f"MarketData option price series for {opt_key} does not "
                    f"cover {start_date}..{end_date}")
            start_actual, end_actual, opt_p0, opt_p1 = pair
            stock_start = _price_on_or_after(stock_px, start_actual)
            if stock_start is None or stock_start[1] <= 0:
                stock_start = _price_on_or_after(stock_px, start_date)
            if stock_start is None or stock_start[1] <= 0:
                raise RuntimeError(
                    f"No underlying price for {ticker} at {start_date}")
            p0, p1 = opt_p0, opt_p1
            underlying_p0 = stock_start[1]
            if p0 <= 0 or underlying_p0 <= 0:
                continue
            position_cost = value * (p0 / underlying_p0)
            position_pnl = value * ((p1 - p0) / underlying_p0)
        else:
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            position_cost = value
            position_pnl = value * stock_ret
        if position_cost <= 0:
            continue
        total_cost += position_cost
        portfolio_pnl += position_pnl
    return portfolio_pnl / total_cost if total_cost else None

def annualize(ret, days):
    """Annualize a return over a given number of calendar days."""
    if ret is None or days <= 0:
        return None
    return (1 + ret) ** (365.25 / days) - 1

def fmt(ret):
    return f"{ret * 100:+.2f}%" if ret is not None else "N/A"

# Collect all tickers and dates
all_tickers = set()
for positions in holdings.values():
    for (ticker, _) in positions:
        all_tickers.add(ticker)
all_tickers.add('SPY')
today = datetime.now().strftime('%Y-%m-%d')
first_date = filing_dates[quarters[0]]
all_dates = set(filing_dates.values()) | set(quarter_end_dates.values()) | {today}
prices = get_prices(sorted(all_tickers), sorted(all_dates))

# Resolve `today` to the actual last available closing date. yfinance may
# not have data for today (market still open or holiday), so we look up
# what date SPY's price actually corresponds to.
def _resolve_price_date(prices, requested_date):
    """Return the actual trading date of the price stored under requested_date."""
    ref = 'SPY' if 'SPY' in prices else next(iter(prices), None)
    if not ref or requested_date not in prices[ref]:
        return requested_date
    target_price = prices[ref][requested_date]
    # Re-download a small window to find the real date of this price
    start = datetime.strptime(requested_date, '%Y-%m-%d') - timedelta(days=10)
    end = datetime.strptime(requested_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(ref, start=start, end=end, progress=False, auto_adjust=True)
    if df.empty:
        return requested_date
    if isinstance(df.columns, pd.MultiIndex):
        close = df['Close'][ref].dropna()
    elif 'Close' in df.columns:
        close = df['Close'].dropna()
    else:
        close = df.iloc[:, 0].dropna()
    for dt, px in close.items():
        val = float(px.iloc[0]) if isinstance(px, pd.Series) else float(px)
        if abs(val - target_price) < 0.01:
            ts = dt[0] if isinstance(dt, tuple) else dt
            return pd.Timestamp(ts).strftime('%Y-%m-%d')
    return requested_date

today_resolved = _resolve_price_date(prices, today)
if today_resolved != today:
    for ticker in prices:
        if today in prices[ticker]:
            prices[ticker][today_resolved] = prices[ticker].pop(today)
    today = today_resolved

def download_daily(tickers, start_date, end_date):
    """Download daily close prices from yfinance, handling MultiIndex.

    Dates are 'YYYY-MM-DD' strings. Adds a small buffer for trading-day
    alignment."""
    tickers_sorted = sorted(tickers)
    start = datetime.strptime(start_date, '%Y-%m-%d') - timedelta(days=5)
    end = datetime.strptime(end_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(tickers_sorted, start=start, end=end, progress=False,
                     auto_adjust=True)
    if df.empty:
        close = pd.DataFrame()
    elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = tickers_sorted
    else:
        close = pd.DataFrame()
    for ticker in tickers_sorted:
        if ticker in close.columns and not close[ticker].dropna().empty:
            continue
        series = _download_close_series(ticker, start, end)
        if not series.empty:
            close[ticker] = series
    return close.sort_index()

# -- Historical option prices via MarketData --------------------------------
OPTION_CACHE_DIR = os.path.expanduser('~/My Drive/notes/.sa-lp-option-cache')
_MD_BASE = 'https://api.marketdata.app/v1'
_MD_RATE_DELAY = 0.15
OPTION_CACHE_COLUMNS = [
    'date', 'selected_on', 'option_type', 'symbol', 'strike', 'expiry',
    'delta', 'price']

# Default contract selection parameters. The option proxy picks a contract
# matching option type, with expiry between min_days and max_days of the
# period start, and |delta| closest to delta_target. When the chain is
# sparse, the achieved |delta| may be far from the target; the sensitivity
# block reports achieved |delta| so this is visible rather than silent.
OPTION_DELTA = 0.15
EXPIRY_MIN_DAYS = 270  # ~9 months
EXPIRY_MAX_DAYS = 456  # ~15 months

def _normalize_option_type(option_type):
    option_type = str(option_type).lower()
    if option_type not in ('call', 'put'):
        raise ValueError(f"Unsupported option type: {option_type}")
    return option_type

def _empty_option_cache():
    return pd.DataFrame(columns=OPTION_CACHE_COLUMNS)

def _option_cache_path(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Return the cache CSV path for (ticker, type, delta_target, window).

    When the parameter triple equals the baseline (0.15, 270-456 days), the
    historical filename ``TICKER-TYPE.csv`` is used so the main-backtest
    cache is reused automatically. Any non-baseline combo lives in a
    separate ``TICKER-TYPE-d<delta>-e<min>-<max>.csv`` file so a sensitivity
    sweep never pollutes the baseline cache (which the portfolio calculator
    reads to pick the representative contract for the current filing).
    """
    option_type = _normalize_option_type(option_type)
    is_baseline = (
        abs(delta_target - OPTION_DELTA) < 1e-9
        and min_days == EXPIRY_MIN_DAYS
        and max_days == EXPIRY_MAX_DAYS)
    if is_baseline:
        return os.path.join(OPTION_CACHE_DIR, f'{ticker}-{option_type}.csv')
    return os.path.join(
        OPTION_CACHE_DIR,
        f'{ticker}-{option_type}-d{delta_target:g}-e{min_days}-{max_days}.csv')

def _load_option_cache(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Load cached MarketData rows for a ticker/type/target/window.

    Returns DataFrame or empty."""
    option_type = _normalize_option_type(option_type)
    path = _option_cache_path(ticker, option_type, delta_target, min_days,
                              max_days)
    if not os.path.exists(path):
        return _empty_option_cache()
    df = pd.read_csv(path)
    if df.empty:
        return _empty_option_cache()
    for col in OPTION_CACHE_COLUMNS:
        if col not in df.columns:
            df[col] = np.nan
    for col in ('date', 'selected_on'):
        df[col] = pd.to_datetime(
            df[col], errors='coerce').dt.strftime('%Y-%m-%d')
    df['option_type'] = df['option_type'].fillna(option_type).str.lower()
    cache = df[OPTION_CACHE_COLUMNS].copy()
    cache = cache[cache['option_type'] == option_type].copy()
    cache.dropna(subset=['date'], inplace=True)
    for col in ('strike', 'delta', 'price'):
        cache[col] = pd.to_numeric(cache[col], errors='coerce')
    cache.drop_duplicates(
        subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
        keep='last', inplace=True)
    cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
    return cache[OPTION_CACHE_COLUMNS]

def _save_option_cache(ticker, option_type, df, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Persist typed option cache to CSV."""
    option_type = _normalize_option_type(option_type)
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    path = _option_cache_path(ticker, option_type, delta_target, min_days,
                              max_days)
    if df.empty:
        df = _empty_option_cache()
    else:
        df = df.copy()
        df['option_type'] = option_type
        for col in OPTION_CACHE_COLUMNS:
            if col not in df.columns:
                df[col] = np.nan
        df.drop_duplicates(
            subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
            keep='last', inplace=True)
        df.sort_values(['date', 'expiry', 'strike'], inplace=True)
    df.to_csv(path, index=False)

def _contract_window(ref_date_str, min_days=EXPIRY_MIN_DAYS,
                     max_days=EXPIRY_MAX_DAYS):
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    return ref + timedelta(days=min_days), ref + timedelta(days=max_days)

def _contract_from_cache_row(row, ref_date_str, option_type,
                             min_days=EXPIRY_MIN_DAYS,
                             max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    if str(row.get('option_type', option_type)).lower() != option_type:
        return None
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    try:
        exp = datetime.strptime(str(row['expiry']), '%Y-%m-%d')
    except (KeyError, TypeError, ValueError):
        return None
    if not (lo <= exp <= hi):
        return None
    strike = _safe_float(row.get('strike'))
    delta = _safe_float(row.get('delta'))
    price = _safe_float(row.get('price'))
    if strike is None or delta is None or price is None or price <= 0:
        return None
    return {
        'selected_on': row.get('selected_on'),
        'option_type': option_type,
        'symbol': row.get('symbol'),
        'strike': strike,
        'expiry': str(row['expiry']),
        'delta': delta,
        'price': price,
    }

def _select_cached_contract(cache, option_type, ref_date_str,
                            delta_target=OPTION_DELTA,
                            min_days=EXPIRY_MIN_DAYS,
                            max_days=EXPIRY_MAX_DAYS,
                            require_selected=False):
    rows = cache[(cache['date'] == ref_date_str)
                 & (cache['option_type'] == option_type)]
    selected_rows = rows[rows['selected_on'] == ref_date_str]
    if not selected_rows.empty:
        rows = selected_rows
    elif require_selected:
        rows = selected_rows
    candidates = []
    for _, row in rows.iterrows():
        contract = _contract_from_cache_row(row, ref_date_str, option_type,
                                            min_days, max_days)
        if contract:
            candidates.append(contract)
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def _parse_option_price(contract):
    """Extract a mark price from an option contract record."""
    mid = _safe_float(contract.get('mid'))
    if mid and mid > 0:
        return mid
    bid = _safe_float(contract.get('bid'))
    ask = _safe_float(contract.get('ask'))
    last = _safe_float(contract.get('last'))
    if bid and ask and bid > 0 and ask > 0:
        return (bid + ask) / 2
    if last and last > 0:
        return last
    return None
def _safe_float(val):
    try:
        out = float(val)
        if np.isnan(out):
            return None
        return out
    except (TypeError, ValueError):
        return None
def _marketdata_key():
    """Return the MarketData API key, or None if unavailable.

    Resolution order:

    1. ``MARKETDATA_KEY`` / ``MARKETDATA_API_KEY`` environment variables.
    2. ``pass env/marketdata-token`` (local ``pass`` store).

    The result is memoised on the function object so repeated lookups
    during a sweep do not reshell. Fetch helpers raise themselves when
    called without a key, so a fully cached run still succeeds without
    requiring either source.
    """
    if hasattr(_marketdata_key, '_cached'):
        return _marketdata_key._cached
    key = (os.environ.get('MARKETDATA_KEY', '')
           or os.environ.get('MARKETDATA_API_KEY', ''))
    if not key:
        try:
            import subprocess
            out = subprocess.run(
                ['pass', 'show', 'env/marketdata-token'],
                capture_output=True, text=True, timeout=5, check=False)
            if out.returncode == 0:
                key = out.stdout.strip().splitlines()[0] if out.stdout else ''
        except (FileNotFoundError, subprocess.TimeoutExpired):
            key = ''
    _marketdata_key._cached = key or None
    return _marketdata_key._cached
def _marketdata_get(path, params, api_key):
    """Fetch a MarketData endpoint, returning normalized row dictionaries.

    Raises on HTTP errors or a non-'ok' status. 'no_data' is returned as
    an empty list so that callers can distinguish 'nothing available' from
    'request failed'.
    """
    headers = {'Accept': 'application/json',
               'Authorization': f'Bearer {api_key}'}
    resp = requests.get(_MD_BASE + path, params=params, headers=headers,
                        timeout=30)
    resp.raise_for_status()
    body = resp.json()
    status = body.get('s')
    if status == 'no_data':
        return []
    if status != 'ok':
        raise RuntimeError(
            f"MarketData {path} returned status={status!r}: "
            f"{body.get('errmsg') or body}")
    lengths = [len(v) for v in body.values() if isinstance(v, list)]
    n = max(lengths) if lengths else 0
    rows = []
    for i in range(n):
        row = {}
        for key, val in body.items():
            if isinstance(val, list):
                row[key] = val[i] if i < len(val) else None
            else:
                row[key] = val
        rows.append(row)
    return rows

def _marketdata_date(timestamp):
    try:
        return datetime.utcfromtimestamp(int(timestamp)).strftime('%Y-%m-%d')
    except (TypeError, ValueError, OSError):
        return None

def _occ_symbol(ticker, option_type, strike, expiry):
    """Build a standard OCC option symbol from contract fields."""
    cp = 'C' if _normalize_option_type(option_type) == 'call' else 'P'
    exp = datetime.strptime(str(expiry), '%Y-%m-%d').strftime('%y%m%d')
    strike_int = int(round(float(strike) * 1000))
    root = ticker.upper().replace('.', '')
    return f'{root}{exp}{cp}{strike_int:08d}'

# Chains are always fetched with a broad expiry window so they can be cached
# and reused for in-memory selection across any (delta_target, expiry window)
# combination in the sensitivity sweep.
CHAIN_FETCH_MIN_DAYS = 30
CHAIN_FETCH_MAX_DAYS = 760

def _fetch_marketdata_chain(ticker, date_str, option_type, api_key,
                            min_days=CHAIN_FETCH_MIN_DAYS,
                            max_days=CHAIN_FETCH_MAX_DAYS):
    lo, hi = _contract_window(date_str, min_days, max_days)
    params = {
        'date': date_str,
        'from': lo.strftime('%Y-%m-%d'),
        'to': hi.strftime('%Y-%m-%d'),
        'side': _normalize_option_type(option_type),
        'expiration': 'all',
    }
    return _marketdata_get(f'/options/chain/{ticker}/', params, api_key)

# Chain cache: one CSV per (ticker, type, date) storing the broad-window
# chain. Lets the sensitivity sweep re-select contracts for different delta
# targets and expiry windows without refetching.
CHAIN_CACHE_DIR = os.path.join(OPTION_CACHE_DIR, 'chains')

def _chain_cache_path(ticker, option_type, date_str):
    option_type = _normalize_option_type(option_type)
    return os.path.join(CHAIN_CACHE_DIR,
                        f'{ticker}-{option_type}-{date_str}.csv')

def _load_chain_cache(ticker, option_type, date_str):
    path = _chain_cache_path(ticker, option_type, date_str)
    if not os.path.exists(path):
        return None
    df = pd.read_csv(path)
    if df.empty:
        return []
    return df.to_dict('records')

def _save_chain_cache(ticker, option_type, date_str, chain):
    if not chain:
        return
    os.makedirs(CHAIN_CACHE_DIR, exist_ok=True)
    path = _chain_cache_path(ticker, option_type, date_str)
    pd.DataFrame(chain).to_csv(path, index=False)

def _get_or_fetch_chain(ticker, date_str, option_type, api_key,
                        fetched_counter=None):
    """Return the cached broad chain for (ticker, type, date), fetching if
    absent. Requires ``api_key`` only when a fetch is actually needed.
    """
    chain = _load_chain_cache(ticker, option_type, date_str)
    if chain is not None:
        return chain
    if not api_key:
        raise RuntimeError(
            "MARKETDATA_KEY is not set but a chain fetch is required for "
            f"{ticker} {option_type} on {date_str}.")
    time.sleep(_MD_RATE_DELAY)
    chain = _fetch_marketdata_chain(ticker, date_str, option_type, api_key)
    if fetched_counter is not None:
        fetched_counter['marketdata_chains'] += 1
    _save_chain_cache(ticker, option_type, date_str, chain)
    return chain

def _fetch_marketdata_quotes(symbol, start_date, end_date, api_key):
    to_date = (datetime.strptime(end_date, '%Y-%m-%d')
               + timedelta(days=1)).strftime('%Y-%m-%d')
    rows = _marketdata_get(f'/options/quotes/{symbol}/',
                           {'from': start_date, 'to': to_date}, api_key)
    prices = {}
    for row in rows:
        date_str = _marketdata_date(row.get('updated'))
        if not date_str:
            continue
        price = _parse_option_price(row)
        if price is not None and price > 0:
            prices[date_str] = price
    return prices
def _implied_vol_from_price(S, K, T, option_price, option_type):
    """Infer Black-Scholes volatility from an observed option mid price."""
    if any(x is None for x in (S, K, T, option_price)):
        return None
    if S <= 0 or K <= 0 or T <= 0 or option_price <= 0:
        return None
    intrinsic = max(S - K, 0) if option_type == 'call' else max(K - S, 0)
    upper = S if option_type == 'call' else K
    if option_price < intrinsic - 1e-6 or option_price > upper * 1.5:
        return None
    lo, hi = 1e-4, 5.0
    try:
        if (option_price < bs_price(S, K, T, lo, option_type) - 1e-4
                or option_price > bs_price(S, K, T, hi, option_type) + 1e-4):
            return None
        for _ in range(80):
            mid = (lo + hi) / 2
            if bs_price(S, K, T, mid, option_type) < option_price:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    except (FloatingPointError, ValueError, ZeroDivisionError):
        return None

def _marketdata_delta(row, ref_date_str, expiry, option_type, price):
    """Use vendor delta when present; otherwise infer it from the quote."""
    native = _safe_float(row.get('delta'))
    if native is not None and native != 0:
        return native
    S = _safe_float(row.get('underlyingPrice'))
    K = _safe_float(row.get('strike'))
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    exp = datetime.strptime(expiry, '%Y-%m-%d')
    T = max((exp - ref).days / 365.25, 1e-6)
    sigma = _safe_float(row.get('iv'))
    if sigma is None or sigma <= 0:
        sigma = _implied_vol_from_price(S, K, T, price, option_type)
    if S is None or K is None or sigma is None or sigma <= 0:
        return None
    return bs_delta(S, K, T, sigma, option_type)

def _select_marketdata_contract(chain, ref_date_str, option_type,
                                delta_target=OPTION_DELTA,
                                min_days=EXPIRY_MIN_DAYS,
                                max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    candidates = []
    for c in chain:
        if str(c.get('side', '')).lower() != option_type:
            continue
        expiry = _marketdata_date(c.get('expiration'))
        if not expiry:
            continue
        exp = datetime.strptime(expiry, '%Y-%m-%d')
        if not (lo <= exp <= hi):
            continue
        price = _parse_option_price(c)
        if price is None or price <= 0:
            continue
        delta = _marketdata_delta(c, ref_date_str, expiry, option_type, price)
        if delta is None or delta == 0:
            continue
        strike = _safe_float(c.get('strike'))
        symbol = c.get('optionSymbol')
        if strike is None or not symbol:
            continue
        candidates.append({
            'option_type': option_type,
            'symbol': symbol,
            'strike': strike,
            'expiry': expiry,
            'delta': delta,
            'price': price,
        })
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
return= candidates[0]= def= download_option_prices(option_positions,= quarters,= holdings,= filing_dates,= today,= delta_target=OPTION_DELTA, min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS): """Download= historical= representative= option= prices= from= MarketData.= MarketData= is= the= sole= supported= provider.= MARKETDATA_KEY= must= be= set.= For= each= (ticker,= option_type)= and= each= filing= period= in= which= that= position= is= held:= 1.= On= the= first= trading= day,= select= a= contract= matching= type,= with= expiry= between= ``min_days``= and= ``max_days``= of= the= period= start,= and= |delta|= closest= to= ``delta_target``.= MarketData's= Starter= plan= often= returns= null= Greeks,= so= delta= is= inferred= from= the= observed= mid= price= via= Black-Scholes= when= the= vendor= delta= is= missing.= 2.= Lock= in= that= contract= for= the= period.= 3.= Track= its= historical= mid= price= through= the= period.= The= broad= option= chain= for= each= (ticker,= type,= first_day)= is= cached= to= disk= so= that= sensitivity= sweeps= over= (delta_target,= expiry= window)= reuse= a= single= fetch.= Raises= ``RuntimeError``= if= no= suitable= contract= can= be= selected= for= any= required= (ticker,= type,= period),= or= if= MarketData= returns= no= price= series= for= the= selected= contract.= Parameters= ----------= delta_target= := float= Target= |delta|= for= contract= selection= (default= ``OPTION_DELTA``).= min_days,= max_days= := int= Contract= expiry= window= in= days= from= period= start= (default= 270-456,= i.e.= 9-15= months).= Returns= -------= per_period= := dict= {quarter_str:= {(ticker,= type):= {date_str:= float}}}= Option= prices= keyed= by= filing= period= then= option= position.= Each= period= has= its= own= contract's= prices.= """= option_positions=sorted({ (ticker,= _normalize_option_type(pos_type))= for= ticker,= pos_type= in= option_positions})= md_key=_marketdata_key() os.makedirs(OPTION_CACHE_DIR,= exist_ok=True) per_period={} #= 
{q:= {(ticker,= type):= {date_str:= price}}}= fetched={'marketdata_chains': 0,= 'marketdata_quotes':= 0}= for= ticker,= option_type= in= option_positions:= opt_key=_option_position_key(ticker, option_type)= cache=_load_option_cache(ticker, option_type,= delta_target,= min_days,= max_days)= new_rows=[] for= i,= q= in= enumerate(quarters):= Skip= quarters= where= this= exact= option= position= is= absent.= if= opt_key= not= in= holdings[q]:= continue= period_start=filing_dates[q] period_end=(filing_dates[quarters[i += 1]]= if= i= <= len(quarters)= -= 1= else= today)= trading_days=pd.bdate_range(period_start, period_end)= if= len(trading_days)== 0:= continue= first_day=trading_days[0].strftime('%Y-%m-%d') --= Select= contract= on= first= trading= day= --= contract=_select_cached_contract( cache,= option_type,= first_day,= delta_target=delta_target, min_days=min_days, max_days=max_days, require_selected=True) if= contract= is= None:= chain=_get_or_fetch_chain( ticker,= first_day,= option_type,= md_key,= fetched)= contract=_select_marketdata_contract( chain,= first_day,= option_type,= delta_target=delta_target, min_days=min_days, max_days=max_days) if= contract= is= None:= raise= RuntimeError(= f"MarketData= returned= no= usable= {option_type}= contract= "= f"for= {ticker}= on= {first_day}= (period= {q})= at= "= f"delta={delta_target}, "= f"expiry= {min_days}-{max_days}d")= new_rows.append({= 'date':= first_day,= 'selected_on':= first_day,= 'option_type':= option_type,= 'symbol':= contract.get('symbol'),= 'strike':= contract['strike'],= 'expiry':= contract['expiry'],= 'delta':= contract['delta'],= 'price':= contract['price'],= })= strike=contract['strike'] expiry=contract['expiry'] symbol=contract.get('symbol') or= _occ_symbol(= ticker,= option_type,= strike,= expiry)= --= Collect= prices= for= this= period= (fresh= dict= per= period)= --= period_prices={} Fast= path:= read= matching= prices= from= cache.= rows=cache[ (cache['date']=>= period_start)
& (cache['date']<= period_end)= &= (cache['option_type']== option_type)= &= (abs(cache['strike']= -= strike)= <= 0.01)= &= (cache['expiry'].astype(str)== str(expiry))= &= pd.notna(cache['price'])]= selected_rows=rows[rows['selected_on'] == first_day]= if= not= selected_rows.empty:= rows=selected_rows for= _,= row= in= rows.iterrows():= period_prices[row['date']]=float(row['price']) Decide= whether= to= refresh= quotes.= With= a= key,= refresh= whenever= the= cached= series= does= not= reach= period_end.= Without= a= key,= only= fail= if= the= cached= series= is= empty;= a= slightly= stale= tail= is= acceptable= for= cache-only= runs= (e.g.= sensitivity= sweeps= replaying= the= baseline= contract).= has_partial=bool(period_prices) reaches_end=has_partial and= max(period_prices)=>= period_end
if md_key and not reaches_end:
time.sleep(_MD_RATE_DELAY)
quote_prices = _fetch_marketdata_quotes(
symbol, period_start, period_end, md_key)
fetched['marketdata_quotes'] += 1
for day_str, price in quote_prices.items():
if period_start<= day_str= <=period_end: period_prices[day_str]=price new_rows.append({= 'date':= day_str,= 'selected_on':= first_day,= 'option_type':= option_type,= 'symbol':= symbol,= 'strike':= strike,= 'expiry':= expiry,= 'delta':= contract['delta'],= 'price':= price,= })= if= contract.get('price')= and= first_day= not= in= period_prices:= period_prices[first_day]=contract['price'] elif= not= md_key= and= not= has_partial:= raise= RuntimeError(= "MARKETDATA_KEY= is= not= set= and= no= cached= quotes= exist= "= f"for= {symbol}= in= {period_start}..{period_end}.")= if= not= period_prices:= raise= RuntimeError(= f"MarketData= returned= no= quotes= for= {symbol}= "= f"({opt_key})= in= {period_start}..{period_end}")= per_period.setdefault(q,= {})[opt_key]=period_prices Persist= new= data= to= cache= if= new_rows:= new_df=pd.DataFrame(new_rows) cache=pd.concat([cache, new_df],= ignore_index=True) cache.drop_duplicates(= subset=['date', 'selected_on',= 'option_type',= 'strike',= 'expiry'],= keep='last' ,= inplace=True) cache.sort_values(['date',= 'expiry',= 'strike'],= inplace=True) _save_option_cache(ticker,= option_type,= cache,= delta_target,= min_days,= max_days)= if= any(fetched.values()):= import= sys= parts=[] if= fetched['marketdata_chains']:= parts.append(f"{fetched['marketdata_chains']}= MarketData= chains")= if= fetched['marketdata_quotes']:= parts.append(f"{fetched['marketdata_quotes']}= MarketData= quote= series")= print(f"[options]= Fetched= {',= '.join(parts)}",= file=sys.stderr) return= per_period= --= Black-Scholes= helpers= (used= only= to= infer= delta= when= MarketData's= Starter-plan= historical= Greeks= are= null;= never= to= reprice= returns)= -----= from= scipy.stats= import= norm= as= _norm= def= bs_price(S,= K,= T,= sigma,= option_type='call' ):= """Black-Scholes= option= price= (assumes= zero= risk-free= rate= and= dividends)."""= if= T= <=0 or= sigma= <=0: if= option_type== 'call':= return= max(S= -= K,= 0)= return= max(K= -= S,= 0)= 
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if option_type == 'call':
        return S * _norm.cdf(d1) - K * _norm.cdf(d2)
    return K * _norm.cdf(-d2) - S * _norm.cdf(-d1)

def bs_delta(S, K, T, sigma, option_type='call'):
    """Black-Scholes delta (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return 1.0 if S > K else 0.0
        return -1.0 if S < K else 0.0
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    if option_type == 'call':
        return _norm.cdf(d1)
    return _norm.cdf(d1) - 1

def daily_cumulative(holdings, quarters, filing_dates, close, today, mode,
                     per_period_opt=None):
    """Build a daily series of cumulative growth factors for a given mode.

    For each filing period, stock shares and option contracts are fixed.
    In equity-proxy mode, option rows are converted to linear underlying
    exposure: calls are long underlying and puts are short underlying. In
    option-proxy mode, option rows are sized by 13F underlying notional
    and returns come from MarketData quotes; returns are divided by
    deployed capital (stock value plus option premium cost). Option-proxy
    mode raises if MarketData prices are missing for any required position.
    """
    cum_growth = 1.0
    dates_out = []
    values_out = []
    for i, q in enumerate(quarters):
        period_start = filing_dates[q]
        period_end = (filing_dates[quarters[i + 1]]
                      if i < len(quarters) - 1 else today)
        ps = pd.Timestamp(period_start)
        pe = pd.Timestamp(period_end)
        # Trading days in this period
        mask = (close.index >= ps) & (close.index <= pe)
        period_close = close[mask]
        if period_close.empty:
            continue
        # Option prices for this period (keyed by (ticker, type) → prices)
        quarter_opt = per_period_opt.get(q, {}) if per_period_opt else {}
        # Determine starting prices, fixed exposure, and deployed capital.
        positions = holdings[q]
        exposure = {}
        costs = {}
        start_prices = {}
        start_underlying = {}
        use_opt_px = {}  # track which positions use option prices
        total_cost = 0
        for (ticker, pos_type), value in positions.items():
            is_option = pos_type in ('call', 'put')
            opt_key = _option_position_key(ticker, pos_type)
            if mode == 'equity_only':
                if pos_type not in ('long', 'call', 'put'):
                    continue
                if ticker not in close.columns:
                    continue
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
                if avail.empty:
                    continue
                stock_start = float(avail.iloc[0])
                if stock_start <= 0:
                    continue
                start_prices[(ticker, pos_type)] = stock_start
                start_underlying[(ticker, pos_type)] = stock_start
                costs[(ticker, pos_type)] = value
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = False
                total_cost += value
                continue
            # mode == 'full' (option proxy)
            if is_option:
                if opt_key not in quarter_opt:
                    raise RuntimeError(
                        f"No MarketData option prices for {opt_key} in "
                        f"period {q}")
                ticker_opt = quarter_opt[opt_key]
                opt_dates = sorted(d for d in ticker_opt if d >= period_start)
                if not opt_dates:
                    raise RuntimeError(
                        f"MarketData option prices for {opt_key} in period "
                        f"{q} contain no dates at or after {period_start}")
                if ticker not in close.columns:
                    raise RuntimeError(
                        f"No underlying close series for {ticker}")
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
                if avail.empty:
                    raise RuntimeError(
                        f"No underlying price for {ticker} at {period_start}")
                opt_start = ticker_opt[opt_dates[0]]
                underlying_start = float(avail.iloc[0])
                if opt_start <= 0 or underlying_start <= 0:
                    raise RuntimeError(
                        f"Non-positive starting price for {opt_key} in "
                        f"period {q}")
                start_prices[(ticker, pos_type)] = opt_start
                start_underlying[(ticker, pos_type)] = underlying_start
                costs[(ticker, pos_type)] = (
                    value * opt_start / underlying_start)
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = True
                total_cost += costs[(ticker, pos_type)]
                continue
            # Plain stock in full mode
            if ticker not in close.columns:
                continue
            src = close[ticker].dropna()
            avail = src[src.index >= ps]
            if avail.empty:
                continue
            stock_start = float(avail.iloc[0])
            if stock_start <= 0:
                continue
            start_prices[(ticker, pos_type)] = stock_start
            start_underlying[(ticker, pos_type)] = stock_start
            costs[(ticker, pos_type)] = value
            exposure[(ticker, pos_type)] = value
            use_opt_px[(ticker, pos_type)] = False
            total_cost += value
        if total_cost == 0:
            continue
        # Daily P&L relative to period start. Skip first day of subsequent
        # periods (already recorded as last day of the prior period) to
        # avoid duplicate boundary dates.
        start_idx = 1 if i > 0 else 0
        # Forward-fill: track last known option price so that gaps in
        # option data don't cause positions to vanish mid-period.
        last_opt = {k: v for k, v in start_prices.items()
                    if use_opt_px.get(k)}
        for day_idx in range(start_idx, len(period_close)):
            day = period_close.index[day_idx]
            day_str = day.strftime('%Y-%m-%d')
            period_pnl = 0
            for (ticker, pos_type), value in exposure.items():
                p0 = start_prices[(ticker, pos_type)]
                if p0 == 0:
                    continue
                if use_opt_px[(ticker, pos_type)]:
                    opt_key = _option_position_key(ticker, pos_type)
                    p1_val = quarter_opt.get(opt_key, {}).get(day_str)
                    if p1_val is not None:
                        last_opt[(ticker, pos_type)] = p1_val
                    else:
                        p1_val = last_opt.get((ticker, pos_type))
                        if p1_val is None:
                            continue
                    underlying_p0 = start_underlying.get((ticker, pos_type))
                    if not underlying_p0 or underlying_p0 <= 0:
                        continue
                    position_pnl = (
                        value * (float(p1_val) - p0) / underlying_p0)
                else:
                    if ticker not in period_close.columns:
                        continue
                    p1_val = period_close[ticker].iloc[day_idx]
                    if pd.isna(p1_val):
                        continue
                    stock_ret = (float(p1_val) - p0) / p0
                    if mode == 'equity_only':
                        position_pnl = (
                            value * _linear_underlying_sign(pos_type)
                            * stock_ret)
                    else:
                        position_pnl = value * stock_ret
                period_pnl += position_pnl
            dates_out.append(day)
            values_out.append(cum_growth * (1 + period_pnl / total_cost))
        # Chain: next period starts from the last day's growth factor
        if values_out:
            cum_growth = values_out[-1]
    return dates_out, values_out

# Download representative option prices from MarketData (raises on
# missing data).
option_positions = sorted({
    (t, pt)
    for q in quarters
    for (t, pt) in holdings[q]
    if pt in ('call', 'put')
})
per_period_opt = download_option_prices(
    option_positions, quarters, holdings, filing_dates, today)

# Compute copycat returns
header = (f"{'Period':<16} {'Dates':<24} "
          f"{'Eq. proxy':>9} {'Opt. proxy':>10} {'SPY':>9}")
print("COPYCAT STRATEGY RETURNS")
print("=" * 72)
print(header)
print("-" * 72)
cum_eq = 1.0
cum_full = 1.0
cum_spy = 1.0
for i, q in enumerate(quarters):
    start = filing_dates[q]
    end = filing_dates[quarters[i + 1]] if i < len(quarters) - 1 else today
    suffix = " †" if i == len(quarters) - 1 else ""
    ret_eq = compute_return(holdings[q], prices, start, end, 'equity_only')
    ret_full = compute_return(holdings[q], prices, start, end, 'full',
                              option_prices=per_period_opt.get(q, {}))
    ret_spy = None
    if 'SPY' in prices and start in prices['SPY'] and end in prices['SPY']:
        spy_p0, spy_p1 = prices['SPY'][start], prices['SPY'][end]
        if spy_p0 != 0:
            ret_spy = (spy_p1 - spy_p0) / spy_p0
    if ret_eq is not None:
        cum_eq *= (1 + ret_eq)
    if ret_full is not None:
        cum_full *= (1 + ret_full)
    if ret_spy is not None:
        cum_spy *= (1 + ret_spy)
    dates_str = f"{start} to {end}"
    print(f"{q + suffix:<16} {dates_str:<24} "
          f"{fmt(ret_eq):>9} {fmt(ret_full):>9} {fmt(ret_spy):>9}")
print("-" * 72)
cum_eq_ret = cum_eq - 1
cum_full_ret = cum_full - 1
cum_spy_ret = cum_spy - 1
dates_str = f"{first_date} to {today}"
print(f"{'Cumulative':<16} {dates_str:<24} "
      f"{fmt(cum_eq_ret):>9} {fmt(cum_full_ret):>9} {fmt(cum_spy_ret):>9}")
print()
print("† = partial period (still holding; updates on re-evaluation)")
print("Eq. proxy = stocks plus option rows as linear underlying exposure")
print("Opt. proxy = options sized to 13F notional; returns on deployed capital")
# ── Risk-adjusted returns ──────────────────────────────────────────
daily_close = download_daily(all_tickers, first_date, today)
def daily_returns_from_cumulative(mode, per_period_opt=None):
if daily_close.empty:
return pd.Series(dtype=float)
dates, values = daily_cumulative(
holdings, quarters, filing_dates, daily_close, today, mode,
per_period_opt=per_period_opt)
if not dates:
return pd.Series(dtype=float)
growth = pd.Series(values, index=dates)
return growth.pct_change().dropna()
ret_eq_d = daily_returns_from_cumulative('equity_only')
ret_full_d = daily_returns_from_cumulative(
'full', per_period_opt=per_period_opt)
if 'SPY' in daily_close.columns:
spy_close = daily_close['SPY'].dropna()
spy_period = spy_close[spy_close.index >= pd.Timestamp(first_date)]
ret_spy_d = spy_period.pct_change().dropna()
else:
ret_spy_d = pd.Series(dtype=float)
def sharpe(daily_rets, rf_annual=0.04):
if daily_rets.empty:
return float('nan')
rf_daily = (1 + rf_annual) ** (1 / 252) - 1
excess = daily_rets - rf_daily
if excess.std() == 0 or pd.isna(excess.std()):
return float('nan')
return float(excess.mean() / excess.std() * 252 ** 0.5)
def max_drawdown(daily_rets):
if daily_rets.empty:
return float('nan')
cum = (1 + daily_rets).cumprod()
return float(((cum - cum.cummax()) / cum.cummax()).min() * 100)
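# Sanity check (illustrative, self-contained): recompute the max-drawdown
# formula used above on a tiny synthetic return series. One -1% day among
# +0.1% days should yield a drawdown of about -1%.
import pandas as pd
_chk = pd.Series([0.001, -0.01, 0.001])
_chk_cum = (1 + _chk).cumprod()
_chk_mdd = float(((_chk_cum - _chk_cum.cummax()) / _chk_cum.cummax()).min() * 100)
assert abs(_chk_mdd + 1.0) < 1e-6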
print()
print("RISK-ADJUSTED RETURNS")
print("=" * 55)
print(f"{'Metric':<25} {'Eq.proxy':>9} {'Opt.proxy':>9} {'SPY':>9}")
print("-" * 55)
vol_eq = float(ret_eq_d.std() * 252 ** 0.5 * 100)
vol_full = float(ret_full_d.std() * 252 ** 0.5 * 100)
vol_spy = float(ret_spy_d.std() * 252 ** 0.5 * 100)
print(f"{'Ann. volatility':<25} {vol_eq:>8.1f}% {vol_full:>8.1f}% {vol_spy:>8.1f}%")
sh_eq = sharpe(ret_eq_d)
sh_full = sharpe(ret_full_d)
sh_spy = sharpe(ret_spy_d)
print(f"{'Sharpe (rf=4%)':<25} {sh_eq:>9.2f} {sh_full:>9.2f} {sh_spy:>9.2f}")
mdd_eq = max_drawdown(ret_eq_d)
mdd_full = max_drawdown(ret_full_d)
mdd_spy = max_drawdown(ret_spy_d)
print(f"{'Max drawdown':<25} {mdd_eq:>8.1f}% {mdd_full:>8.1f}% {mdd_spy:>8.1f}%")
```<a id="code-snippet--sa-chart"/>
```python
import json
import yfinance as yf
import pandas as pd
from datetime import datetime, timedelta
import numpy as np
import requests
import time
import os
import warnings
warnings.filterwarnings('ignore')
# Parse data from the scraper block
parsed = json.loads(data) if isinstance(data, str) else data
filings = parsed["filings"]
# Build internal structures
filing_dates = {f["quarter"]: f["filing_date"] for f in filings}
quarter_end_dates = {f["quarter"]: f["quarter_end"] for f in filings}
quarters = [f["quarter"] for f in filings]
# Convert holdings list to dict keyed by quarter. Multiple positions in
# the same ticker with different types are aggregated by value per
# (ticker, type) pair.
holdings = {}
for f in filings:
positions = {}
for h in f["holdings"]:
ticker = h["ticker"]
pos_type = h["type"]
value = h["value"]
key = (ticker, pos_type)
positions[key] = positions.get(key, 0) + value
holdings[f["quarter"]] = positions
def _extract_close_series(df, ticker):
"""Extract a single close-price series from a yfinance result."""
if df.empty:
return pd.Series(dtype=float)
if isinstance(df.columns, pd.MultiIndex):
if 'Close' not in df.columns.get_level_values(0):
return pd.Series(dtype=float)
close = df['Close']
if isinstance(close, pd.DataFrame):
if ticker in close.columns:
series = close[ticker]
elif len(close.columns) == 1:
series = close.iloc[:, 0]
else:
return pd.Series(dtype=float)
else:
series = close
elif 'Close' in df.columns:
series = df['Close']
if isinstance(series, pd.DataFrame):
series = series.iloc[:, 0]
else:
return pd.Series(dtype=float)
return pd.to_numeric(series, errors='coerce').dropna()
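# Illustrative check (self-contained): yfinance returns MultiIndex
# (metric, ticker) columns for multi-ticker downloads, so df['Close']
# yields a per-ticker frame — the case the helper above handles first.
import pandas as pd
_cols = pd.MultiIndex.from_product([['Close', 'Open'], ['AAPL', 'MSFT']])
_demo = pd.DataFrame([[1.0, 2.0, 3.0, 4.0]], columns=_cols)
assert list(_demo['Close'].columns) == ['AAPL', 'MSFT']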
def _download_close_series(ticker, start, end):
"""Download one ticker's close series; used to repair flaky batch misses."""
df = yf.download(ticker, start=start, end=end, progress=False,
auto_adjust=True)
return _extract_close_series(df, ticker)
def get_prices(tickers, dates):
"""Fetch close prices for tickers on specific dates."""
unique_tickers = sorted(set(tickers))
all_dates = [datetime.strptime(d, '%Y-%m-%d') for d in dates]
start = min(all_dates) - timedelta(days=5)
end = max(all_dates) + timedelta(days=5)
df = yf.download(unique_tickers, start=start, end=end, progress=False, auto_adjust=True)
# yf.download returns MultiIndex columns (metric, ticker) for multiple tickers
if df.empty:
close = pd.DataFrame()
elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
close = df['Close'].copy()
elif 'Close' in df.columns:
close = df[['Close']].copy()
close.columns = unique_tickers
else:
close = pd.DataFrame()
prices = {}
for ticker in unique_tickers:
if ticker in close.columns:
series = pd.to_numeric(close[ticker], errors='coerce').dropna()
else:
series = pd.Series(dtype=float)
if series.empty:
series = _download_close_series(ticker, start, end)
if series.empty:
continue
prices[ticker] = {}
for date_str in dates:
target = pd.Timestamp(datetime.strptime(date_str, '%Y-%m-%d'))
after = series[series.index >= target]
if not after.empty:
prices[ticker][date_str] = float(after.iloc[0])
else:
            before = series[series.index <= target]
            if not before.empty:
                prices[ticker][date_str] = float(before.iloc[-1])
    return prices

def _price_on_or_after(px_by_date, target_date):
    """Return (date, price) for the first available price on/after target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d >= target_date)
if not dates:
return None
d = dates[0]
return d, px_by_date[d]
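# Illustrative (self-contained): the on-or-after lookup above is a sorted
# scan over ISO-date keys, which compare correctly as plain strings.
_px = {'2026-01-02': 10.0, '2026-01-05': 11.0}
_d = sorted(k for k in _px if k >= '2026-01-03')[0]
assert (_d, _px[_d]) == ('2026-01-05', 11.0)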
def _price_on_or_before(px_by_date, target_date):
"""Return (date, price) for the last available price on/before target."""
if not px_by_date:
return None
    dates = sorted(d for d in px_by_date if d <= target_date)
    if not dates:
        return None
    d = dates[-1]
    return d, px_by_date[d]

def _period_price_pair(px_by_date, start_date, end_date):
    """Return start/end prices for a period using sensible boundary alignment."""
    start = _price_on_or_after(px_by_date, start_date)
    end = _price_on_or_before(px_by_date, end_date)
    if start is None or end is None:
        return None
    start_actual, p0 = start
    end_actual, p1 = end
    if end_actual < start_actual:
        return None
    return start_actual, end_actual, p0, p1

def _option_position_key(ticker, pos_type):
    return (ticker, pos_type)

def _linear_underlying_sign(pos_type):
    """Direction when option rows are converted to underlying equity exposure."""
    return -1 if pos_type == 'put' else 1

def compute_return(positions, prices, start_date, end_date,
                   mode='equity_only', option_prices=None):
    """Compute portfolio return between two dates.

    The 13F value for an option row is treated as underlying notional, not
    option premium. Option contracts are sized from that notional, but the
    portfolio denominator is estimated deployed capital: stock value plus
    option premium cost. This avoids treating the gap between option
    notional and option premium as cash.

    In 'full' mode, every option row requires a MarketData price series;
    missing data raises rather than falling back.
    """
    total_cost = 0
    portfolio_pnl = 0
    for (ticker, pos_type), value in positions.items():
        is_option = pos_type in ('call', 'put')
        stock_px = prices.get(ticker)
        if mode == 'equity_only':
            if pos_type not in ('long', 'call', 'put'):
                continue
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            total_cost += value
            portfolio_pnl += (
                value * _linear_underlying_sign(pos_type) * stock_ret)
            continue
        if is_option:
            opt_key = _option_position_key(ticker, pos_type)
            opt_px = option_prices.get(opt_key) if option_prices else None
            if not opt_px:
                raise RuntimeError(
                    f"No MarketData option prices for {opt_key} in period "
                    f"{start_date}..{end_date}")
            pair = _period_price_pair(opt_px, start_date, end_date)
            if pair is None:
                raise RuntimeError(
                    f"MarketData option price series for {opt_key} does not "
                    f"cover {start_date}..{end_date}")
            start_actual, end_actual, opt_p0, opt_p1 = pair
            stock_start = _price_on_or_after(stock_px, start_actual)
            if stock_start is None or stock_start[1] <= 0:
                stock_start = _price_on_or_after(stock_px, start_date)
            if stock_start is None or stock_start[1] <= 0:
                raise RuntimeError(
                    f"No underlying price for {ticker} at {start_date}")
            p0, p1 = opt_p0, opt_p1
            underlying_p0 = stock_start[1]
            if p0 <= 0 or underlying_p0 <= 0:
                continue
            position_cost = value * (p0 / underlying_p0)
            position_pnl = value * ((p1 - p0) / underlying_p0)
        else:
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            position_cost = value
            position_pnl = value * stock_ret
        if position_cost <= 0:
            continue
        total_cost += position_cost
        portfolio_pnl += position_pnl
    return portfolio_pnl / total_cost if total_cost else None

def annualize(ret, days):
    """Annualize a return over a given number of calendar days."""
    if ret is None or days <= 0:
        return None
    return (1 + ret) ** (365.25 / days) - 1

def fmt(ret):
    return f"{ret * 100:+.2f}%" if ret is not None else "N/A"

# Collect all tickers and dates
all_tickers = set()
for positions in holdings.values():
    for (ticker, _) in positions:
        all_tickers.add(ticker)
all_tickers.add('SPY')
today = datetime.now().strftime('%Y-%m-%d')
first_date = filing_dates[quarters[0]]
all_dates = set(filing_dates.values()) | set(quarter_end_dates.values()) | {today}
prices = get_prices(sorted(all_tickers), sorted(all_dates))

# Resolve `today` to the actual last available closing date. yfinance may
# not have data for today (market still open or holiday), so we look up
# what date SPY's price actually corresponds to.
def _resolve_price_date(prices, requested_date):
    """Return the actual trading date of the price stored under requested_date."""
    ref = 'SPY' if 'SPY' in prices else next(iter(prices), None)
    if not ref or requested_date not in prices[ref]:
        return requested_date
    target_price = prices[ref][requested_date]
    # Re-download a small window to find the real date of this price
    start = datetime.strptime(requested_date, '%Y-%m-%d') - timedelta(days=10)
    end = datetime.strptime(requested_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(ref, start=start, end=end, progress=False,
                     auto_adjust=True)
    if df.empty:
        return requested_date
    if isinstance(df.columns, pd.MultiIndex):
        close = df['Close'][ref].dropna()
    elif 'Close' in df.columns:
        close = df['Close'].dropna()
    else:
        close = df.iloc[:, 0].dropna()
    for dt, px in close.items():
        val = float(px.iloc[0]) if isinstance(px, pd.Series) else float(px)
        if abs(val - target_price) < 0.01:
            ts = dt[0] if isinstance(dt, tuple) else dt
            return pd.Timestamp(ts).strftime('%Y-%m-%d')
    return requested_date

today_resolved = _resolve_price_date(prices, today)
if today_resolved != today:
    for ticker in prices:
        if today in prices[ticker]:
            prices[ticker][today_resolved] = prices[ticker].pop(today)
    today = today_resolved

def download_daily(tickers, start_date, end_date):
    """Download daily close prices from yfinance, handling MultiIndex.

    Dates are 'YYYY-MM-DD' strings. Adds a small buffer for trading-day
    alignment."""
    tickers_sorted = sorted(tickers)
    start = datetime.strptime(start_date, '%Y-%m-%d') - timedelta(days=5)
    end = datetime.strptime(end_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(tickers_sorted, start=start, end=end, progress=False,
                     auto_adjust=True)
    if df.empty:
        close = pd.DataFrame()
    elif (isinstance(df.columns, pd.MultiIndex)
          and 'Close' in df.columns.get_level_values(0)):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = tickers_sorted
    else:
        close = pd.DataFrame()
    for ticker in tickers_sorted:
        if ticker in close.columns and not close[ticker].dropna().empty:
            continue
        series = _download_close_series(ticker, start, end)
        if not series.empty:
            close[ticker] = series
    return close.sort_index()

# -- Historical option prices via MarketData --------------------------------
OPTION_CACHE_DIR = os.path.expanduser('~/My Drive/notes/.sa-lp-option-cache')
_MD_BASE = 'https://api.marketdata.app/v1'
_MD_RATE_DELAY = 0.15
OPTION_CACHE_COLUMNS = [
    'date', 'selected_on', 'option_type', 'symbol', 'strike', 'expiry',
    'delta', 'price']
# Default contract selection parameters. The option proxy picks a contract
# matching option type, with expiry between min_days and max_days of the
# period start, and |delta| closest to delta_target. When the chain is
# sparse, the achieved |delta| may be far from the target; the sensitivity
# block reports achieved |delta| so this is visible rather than silent.
OPTION_DELTA = 0.15
EXPIRY_MIN_DAYS = 270  # ~9 months
EXPIRY_MAX_DAYS = 456  # ~15 months

def _normalize_option_type(option_type):
    option_type = str(option_type).lower()
    if option_type not in ('call', 'put'):
        raise ValueError(f"Unsupported option type: {option_type}")
    return option_type

def _empty_option_cache():
    return pd.DataFrame(columns=OPTION_CACHE_COLUMNS)

def _option_cache_path(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Return the cache CSV path for (ticker, type, delta_target, window).

    When the parameter triple equals the baseline (0.15, 270-456 days), the
    historical filename ``TICKER-TYPE.csv`` is used so the main-backtest
    cache is reused automatically. Any non-baseline combo lives in a
    separate ``TICKER-TYPE-d<delta>-e<min>-<max>.csv`` file so a sensitivity
    sweep never pollutes the baseline cache (which the portfolio calculator
    reads to pick the representative contract for the current filing).
    """
    option_type = _normalize_option_type(option_type)
    is_baseline = (
        abs(delta_target - OPTION_DELTA) < 1e-9
        and min_days == EXPIRY_MIN_DAYS
        and max_days == EXPIRY_MAX_DAYS)
    if is_baseline:
        return os.path.join(OPTION_CACHE_DIR, f'{ticker}-{option_type}.csv')
    return os.path.join(
        OPTION_CACHE_DIR,
        f'{ticker}-{option_type}-d{delta_target:g}-e{min_days}-{max_days}.csv')

def _load_option_cache(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Load cached MarketData rows for a ticker/type/target/window.

    Returns DataFrame or empty."""
    option_type = _normalize_option_type(option_type)
    path = _option_cache_path(ticker, option_type, delta_target, min_days,
                              max_days)
    if not os.path.exists(path):
        return _empty_option_cache()
    df = pd.read_csv(path)
    if df.empty:
        return _empty_option_cache()
    for col in OPTION_CACHE_COLUMNS:
        if col not in df.columns:
            df[col] = np.nan
    for col in ('date', 'selected_on'):
        df[col] = pd.to_datetime(
            df[col], errors='coerce').dt.strftime('%Y-%m-%d')
    df['option_type'] = df['option_type'].fillna(option_type).str.lower()
    cache = df[OPTION_CACHE_COLUMNS].copy()
    cache = cache[cache['option_type'] == option_type].copy()
    cache.dropna(subset=['date'], inplace=True)
    for col in ('strike', 'delta', 'price'):
        cache[col] = pd.to_numeric(cache[col], errors='coerce')
    cache.drop_duplicates(
        subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
        keep='last', inplace=True)
    cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
    return cache[OPTION_CACHE_COLUMNS]

def _save_option_cache(ticker, option_type, df, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Persist typed option cache to CSV."""
    option_type = _normalize_option_type(option_type)
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    path = _option_cache_path(ticker, option_type, delta_target, min_days,
                              max_days)
    if df.empty:
        df = _empty_option_cache()
    else:
        df = df.copy()
        df['option_type'] = option_type
        for col in OPTION_CACHE_COLUMNS:
            if col not in df.columns:
                df[col] = np.nan
        df.drop_duplicates(
            subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
            keep='last', inplace=True)
        df.sort_values(['date', 'expiry', 'strike'], inplace=True)
    df.to_csv(path, index=False)

def _contract_window(ref_date_str, min_days=EXPIRY_MIN_DAYS,
                     max_days=EXPIRY_MAX_DAYS):
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    return ref + timedelta(days=min_days), ref + timedelta(days=max_days)

def _contract_from_cache_row(row, ref_date_str, option_type,
                             min_days=EXPIRY_MIN_DAYS,
                             max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    if str(row.get('option_type', option_type)).lower() != option_type:
        return None
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    try:
        exp = datetime.strptime(str(row['expiry']), '%Y-%m-%d')
    except (KeyError, TypeError, ValueError):
        return None
    if not (lo <= exp <= hi):
        return None
    strike = _safe_float(row.get('strike'))
    delta = _safe_float(row.get('delta'))
    price = _safe_float(row.get('price'))
    if strike is None or delta is None or price is None or price <= 0:
        return None
    return {
        'selected_on': row.get('selected_on'),
        'option_type': option_type,
        'symbol': row.get('symbol'),
        'strike': strike,
        'expiry': str(row['expiry']),
        'delta': delta,
        'price': price,
    }

def _select_cached_contract(cache, option_type, ref_date_str,
                            delta_target=OPTION_DELTA,
                            min_days=EXPIRY_MIN_DAYS,
                            max_days=EXPIRY_MAX_DAYS,
                            require_selected=False):
    rows = cache[(cache['date'] == ref_date_str)
                 & (cache['option_type'] == option_type)]
    selected_rows = rows[rows['selected_on'] == ref_date_str]
    if not selected_rows.empty:
        rows = selected_rows
    elif require_selected:
        rows = selected_rows
    candidates = []
    for _, row in rows.iterrows():
        contract = _contract_from_cache_row(row, ref_date_str, option_type,
                                            min_days, max_days)
        if contract:
            candidates.append(contract)
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def _parse_option_price(contract):
    """Extract a mark price from an option contract record."""
    mid = _safe_float(contract.get('mid'))
    if mid and mid > 0:
return mid
bid = _safe_float(contract.get('bid'))
ask = _safe_float(contract.get('ask'))
last = _safe_float(contract.get('last'))
if bid and ask and bid > 0 and ask > 0:
return (bid + ask) / 2
if last and last > 0:
return last
return None
def _safe_float(val):
    try:
        out = float(val)
        if np.isnan(out):
            return None
        return out
    except (TypeError, ValueError):
        return None
def _marketdata_key():
    """Return the MarketData API key, or None if unavailable.

    Resolution order:
    1. ``MARKETDATA_KEY`` / ``MARKETDATA_API_KEY`` environment variables.
    2. ``pass env/marketdata-token`` (local ``pass`` store).

    The result is memoised on the function object so repeated lookups
    during a sweep do not reshell. Fetch helpers raise themselves when
    called without a key, so a fully cached run still succeeds without
    requiring either source.
    """
    if hasattr(_marketdata_key, '_cached'):
        return _marketdata_key._cached
    key = (os.environ.get('MARKETDATA_KEY', '')
           or os.environ.get('MARKETDATA_API_KEY', ''))
    if not key:
        try:
            import subprocess
            out = subprocess.run(
                ['pass', 'show', 'env/marketdata-token'],
                capture_output=True, text=True, timeout=5, check=False)
            if out.returncode == 0:
                key = out.stdout.strip().splitlines()[0] if out.stdout else ''
        except (FileNotFoundError, subprocess.TimeoutExpired):
            key = ''
    _marketdata_key._cached = key or None
    return _marketdata_key._cached
def _marketdata_get(path, params, api_key):
    """Fetch a MarketData endpoint, returning normalized row dictionaries.

    Raises on HTTP errors or a non-'ok' status. 'no_data' is returned as
    an empty list so that callers can distinguish 'nothing available' from
    'request failed'.
    """
    headers = {'Accept': 'application/json',
               'Authorization': f'Bearer {api_key}'}
    resp = requests.get(_MD_BASE + path, params=params, headers=headers,
                        timeout=30)
    resp.raise_for_status()
    body = resp.json()
    status = body.get('s')
    if status == 'no_data':
        return []
    if status != 'ok':
        raise RuntimeError(
            f"MarketData {path} returned status={status!r}: "
            f"{body.get('errmsg') or body}")
    lengths = [len(v) for v in body.values() if isinstance(v, list)]
    n = max(lengths) if lengths else 0
    rows = []
    for i in range(n):
        row = {}
        for key, val in body.items():
            if isinstance(val, list):
                row[key] = val[i] if i < len(val) else None
            else:
                row[key] = val
        rows.append(row)
    return rows


def _marketdata_date(timestamp):
    try:
        return datetime.utcfromtimestamp(int(timestamp)).strftime('%Y-%m-%d')
    except (TypeError, ValueError, OSError):
        return None


def _occ_symbol(ticker, option_type, strike, expiry):
    """Build a standard OCC option symbol from contract fields."""
    cp = 'C' if _normalize_option_type(option_type) == 'call' else 'P'
    exp = datetime.strptime(str(expiry), '%Y-%m-%d').strftime('%y%m%d')
    strike_int = int(round(float(strike) * 1000))
    root = ticker.upper().replace('.', '')
    return f'{root}{exp}{cp}{strike_int:08d}'


# Chains are always fetched with a broad expiry window so they can be cached
# and reused for in-memory selection across any (delta_target, expiry window)
# combination in the sensitivity sweep.
CHAIN_FETCH_MIN_DAYS = 30
CHAIN_FETCH_MAX_DAYS = 760


def _fetch_marketdata_chain(ticker, date_str, option_type, api_key,
                            min_days=CHAIN_FETCH_MIN_DAYS,
                            max_days=CHAIN_FETCH_MAX_DAYS):
    lo, hi = _contract_window(date_str, min_days, max_days)
    params = {
        'date': date_str,
        'from': lo.strftime('%Y-%m-%d'),
        'to': hi.strftime('%Y-%m-%d'),
        'side': _normalize_option_type(option_type),
        'expiration': 'all',
    }
    return _marketdata_get(f'/options/chain/{ticker}/', params, api_key)


# Chain cache: one CSV per (ticker, type, date) storing the broad-window
# chain. Lets the sensitivity sweep re-select contracts for different delta
# targets and expiry windows without refetching.
CHAIN_CACHE_DIR = os.path.join(OPTION_CACHE_DIR, 'chains')


def _chain_cache_path(ticker, option_type, date_str):
    option_type = _normalize_option_type(option_type)
    return os.path.join(CHAIN_CACHE_DIR,
                        f'{ticker}-{option_type}-{date_str}.csv')


def _load_chain_cache(ticker, option_type, date_str):
    path = _chain_cache_path(ticker, option_type, date_str)
    if not os.path.exists(path):
        return None
    df = pd.read_csv(path)
    if df.empty:
        return []
    return df.to_dict('records')


def _save_chain_cache(ticker, option_type, date_str, chain):
    if not chain:
        return
    os.makedirs(CHAIN_CACHE_DIR, exist_ok=True)
    path = _chain_cache_path(ticker, option_type, date_str)
    pd.DataFrame(chain).to_csv(path, index=False)


def _get_or_fetch_chain(ticker, date_str, option_type, api_key,
                        fetched_counter=None):
    """Return the cached broad chain for (ticker, type, date), fetching if
    absent.

    Requires ``api_key`` only when a fetch is actually needed.
    """
    chain = _load_chain_cache(ticker, option_type, date_str)
    if chain is not None:
        return chain
    if not api_key:
        raise RuntimeError(
            "MARKETDATA_KEY is not set but a chain fetch is required for "
            f"{ticker} {option_type} on {date_str}.")
    time.sleep(_MD_RATE_DELAY)
    chain = _fetch_marketdata_chain(ticker, date_str, option_type, api_key)
    if fetched_counter is not None:
        fetched_counter['marketdata_chains'] += 1
    _save_chain_cache(ticker, option_type, date_str, chain)
    return chain


def _fetch_marketdata_quotes(symbol, start_date, end_date, api_key):
    to_date = (datetime.strptime(end_date, '%Y-%m-%d')
               + timedelta(days=1)).strftime('%Y-%m-%d')
    rows = _marketdata_get(f'/options/quotes/{symbol}/',
                           {'from': start_date, 'to': to_date}, api_key)
    prices = {}
    for row in rows:
        date_str = _marketdata_date(row.get('updated'))
        if not date_str:
            continue
        price = _parse_option_price(row)
        if price is not None and price > 0:
            prices[date_str] = price
    return prices
def _implied_vol_from_price(S, K, T, option_price, option_type):
    """Infer Black-Scholes volatility from an observed option mid price."""
    if any(x is None for x in (S, K, T, option_price)):
        return None
    if S <= 0 or K <= 0 or T <= 0 or option_price <= 0:
        return None
    intrinsic = max(S - K, 0) if option_type == 'call' else max(K - S, 0)
    upper = S if option_type == 'call' else K
    if option_price < intrinsic - 1e-6 or option_price > upper * 1.5:
        return None
    lo, hi = 1e-4, 5.0
    try:
        if (option_price < bs_price(S, K, T, lo, option_type) - 1e-4
                or option_price > bs_price(S, K, T, hi, option_type) + 1e-4):
            return None
        # Bisection on volatility: bs_price is monotone in sigma.
        for _ in range(80):
            mid = (lo + hi) / 2
            if bs_price(S, K, T, mid, option_type) < option_price:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    except (FloatingPointError, ValueError, ZeroDivisionError):
        return None


def _marketdata_delta(row, ref_date_str, expiry, option_type, price):
    """Use vendor delta when present; otherwise infer it from the quote."""
    native = _safe_float(row.get('delta'))
    if native is not None and native != 0:
        return native
    S = _safe_float(row.get('underlyingPrice'))
    K = _safe_float(row.get('strike'))
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    exp = datetime.strptime(expiry, '%Y-%m-%d')
    T = max((exp - ref).days / 365.25, 1e-6)
    sigma = _safe_float(row.get('iv'))
    if sigma is None or sigma <= 0:
        sigma = _implied_vol_from_price(S, K, T, price, option_type)
    if S is None or K is None or sigma is None or sigma <= 0:
        return None
    return bs_delta(S, K, T, sigma, option_type)


def _select_marketdata_contract(chain, ref_date_str, option_type,
                                delta_target=OPTION_DELTA,
                                min_days=EXPIRY_MIN_DAYS,
                                max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    candidates = []
    for c in chain:
        if str(c.get('side', '')).lower() != option_type:
            continue
        expiry = _marketdata_date(c.get('expiration'))
        if not expiry:
            continue
        exp = datetime.strptime(expiry, '%Y-%m-%d')
        if not (lo <= exp <= hi):
            continue
        price = _parse_option_price(c)
        if price is None or price <= 0:
            continue
        delta = _marketdata_delta(c, ref_date_str, expiry, option_type, price)
        if delta is None or delta == 0:
            continue
        strike = _safe_float(c.get('strike'))
        symbol = c.get('optionSymbol')
        if strike is None or not symbol:
            continue
        candidates.append({
            'option_type': option_type,
            'symbol': symbol,
            'strike': strike,
            'expiry': expiry,
            'delta': delta,
            'price': price,
        })
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]


def download_option_prices(option_positions, quarters, holdings, filing_dates,
                           today, delta_target=OPTION_DELTA,
                           min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Download historical representative option prices from MarketData.

    MarketData is the sole supported provider. MARKETDATA_KEY must be set.
    For each (ticker, option_type) and each filing period in which that
    position is held:

    1. On the first trading day, select a contract matching type, with
       expiry between ``min_days`` and ``max_days`` of the period start,
       and |delta| closest to ``delta_target``. MarketData's Starter plan
       often returns null Greeks, so delta is inferred from the observed
       mid price via Black-Scholes when the vendor delta is missing.
    2. Lock in that contract for the period.
    3. Track its historical mid price through the period.

    The broad option chain for each (ticker, type, first_day) is cached to
    disk so that sensitivity sweeps over (delta_target, expiry window)
    reuse a single fetch.

    Raises ``RuntimeError`` if no suitable contract can be selected for any
    required (ticker, type, period), or if MarketData returns no price
    series for the selected contract.

    Parameters
    ----------
    delta_target : float
        Target |delta| for contract selection (default ``OPTION_DELTA``).
    min_days, max_days : int
        Contract expiry window in days from period start (default 270-456,
        i.e. 9-15 months).

    Returns
    -------
    per_period : dict
        {quarter_str: {(ticker, type): {date_str: float}}}
        Option prices keyed by filing period then option position. Each
        period has its own contract's prices.
    """
    option_positions = sorted({
        (ticker, _normalize_option_type(pos_type))
        for ticker, pos_type in option_positions})
    md_key = _marketdata_key()
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    per_period = {}  # {q: {(ticker, type): {date_str: price}}}
    fetched = {'marketdata_chains': 0, 'marketdata_quotes': 0}
    for ticker, option_type in option_positions:
        opt_key = _option_position_key(ticker, option_type)
        cache = _load_option_cache(ticker, option_type, delta_target,
                                   min_days, max_days)
        new_rows = []
        for i, q in enumerate(quarters):
            # Skip quarters where this exact option position is absent.
            if opt_key not in holdings[q]:
                continue
            period_start = filing_dates[q]
            period_end = (filing_dates[quarters[i + 1]]
                          if i < len(quarters) - 1 else today)
            trading_days = pd.bdate_range(period_start, period_end)
            if len(trading_days) == 0:
                continue
            first_day = trading_days[0].strftime('%Y-%m-%d')
            # -- Select contract on first trading day --
            contract = _select_cached_contract(
                cache, option_type, first_day,
                delta_target=delta_target,
                min_days=min_days, max_days=max_days,
                require_selected=True)
            if contract is None:
                chain = _get_or_fetch_chain(
                    ticker, first_day, option_type, md_key, fetched)
                contract = _select_marketdata_contract(
                    chain, first_day, option_type,
                    delta_target=delta_target,
                    min_days=min_days, max_days=max_days)
                if contract is None:
                    raise RuntimeError(
                        f"MarketData returned no usable {option_type} "
                        f"contract for {ticker} on {first_day} (period {q}) "
                        f"at delta={delta_target}, "
                        f"expiry {min_days}-{max_days}d")
                new_rows.append({
                    'date': first_day,
                    'selected_on': first_day,
                    'option_type': option_type,
                    'symbol': contract.get('symbol'),
                    'strike': contract['strike'],
                    'expiry': contract['expiry'],
                    'delta': contract['delta'],
                    'price': contract['price'],
                })
            strike = contract['strike']
            expiry = contract['expiry']
            symbol = contract.get('symbol') or _occ_symbol(
                ticker, option_type, strike, expiry)
            # -- Collect prices for this period (fresh dict per period) --
            period_prices = {}
            # Fast path: read matching prices from cache.
            rows = cache[
                (cache['date'] >= period_start)
                & (cache['date'] <= period_end)
                & (cache['option_type'] == option_type)
                & (abs(cache['strike'] - strike) < 0.01)
                & (cache['expiry'].astype(str) == str(expiry))
                & pd.notna(cache['price'])]
            selected_rows = rows[rows['selected_on'] == first_day]
            if not selected_rows.empty:
                rows = selected_rows
            for _, row in rows.iterrows():
                period_prices[row['date']] = float(row['price'])
            # Decide whether to refresh quotes. With a key, refresh whenever
            # the cached series does not reach period_end. Without a key,
            # only fail if the cached series is empty; a slightly stale
            # tail is acceptable for cache-only runs (e.g. sensitivity
            # sweeps replaying the baseline contract).
            has_partial = bool(period_prices)
            reaches_end = has_partial and max(period_prices) >= period_end
            if md_key and not reaches_end:
                time.sleep(_MD_RATE_DELAY)
                quote_prices = _fetch_marketdata_quotes(
                    symbol, period_start, period_end, md_key)
                fetched['marketdata_quotes'] += 1
                for day_str, price in quote_prices.items():
                    if period_start <= day_str <= period_end:
                        period_prices[day_str] = price
                        new_rows.append({
                            'date': day_str,
                            'selected_on': first_day,
                            'option_type': option_type,
                            'symbol': symbol,
                            'strike': strike,
                            'expiry': expiry,
                            'delta': contract['delta'],
                            'price': price,
                        })
                if contract.get('price') and first_day not in period_prices:
                    period_prices[first_day] = contract['price']
            elif not md_key and not has_partial:
                raise RuntimeError(
                    "MARKETDATA_KEY is not set and no cached quotes exist "
                    f"for {symbol} in {period_start}..{period_end}.")
            if not period_prices:
                raise RuntimeError(
                    f"MarketData returned no quotes for {symbol} "
                    f"({opt_key}) in {period_start}..{period_end}")
            per_period.setdefault(q, {})[opt_key] = period_prices
        # Persist new data to cache
        if new_rows:
            new_df = pd.DataFrame(new_rows)
            cache = pd.concat([cache, new_df], ignore_index=True)
            cache.drop_duplicates(
                subset=['date', 'selected_on', 'option_type', 'strike',
                        'expiry'],
                keep='last', inplace=True)
            cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
            _save_option_cache(ticker, option_type, cache,
                               delta_target, min_days, max_days)
    if any(fetched.values()):
        import sys
        parts = []
        if fetched['marketdata_chains']:
            parts.append(f"{fetched['marketdata_chains']} MarketData chains")
        if fetched['marketdata_quotes']:
            parts.append(
                f"{fetched['marketdata_quotes']} MarketData quote series")
        print(f"[options] Fetched {', '.join(parts)}", file=sys.stderr)
    return per_period


# -- Black-Scholes helpers (used only to infer delta when MarketData's
# Starter-plan historical Greeks are null; never to reprice returns) -----
from scipy.stats import norm as _norm


def bs_price(S, K, T, sigma, option_type='call'):
    """Black-Scholes option price (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return max(S - K, 0)
        return max(K - S, 0)
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if option_type == 'call':
        return S * _norm.cdf(d1) - K * _norm.cdf(d2)
    return K * _norm.cdf(-d2) - S * _norm.cdf(-d1)


def bs_delta(S, K, T, sigma, option_type='call'):
    """Black-Scholes delta (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return 1.0 if S > K else 0.0
        return -1.0 if S < K else 0.0
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    if option_type == 'call':
        return _norm.cdf(d1)
    return _norm.cdf(d1) - 1


def daily_cumulative(holdings, quarters, filing_dates, close, today, mode,
                     per_period_opt=None):
    """Build a daily series of cumulative growth factors for a given mode.

    For each filing period, stock shares and option contracts are fixed.
    In equity-proxy mode, option rows are converted to linear underlying
    exposure: calls are long underlying and puts are short underlying.
    In option-proxy mode, option rows are sized by 13F underlying notional
    and returns come from MarketData quotes; returns are divided by
    deployed capital (stock value plus option premium cost). Option-proxy
    mode raises if MarketData prices are missing for any required position.
    """
    cum_growth = 1.0
    dates_out = []
    values_out = []
    for i, q in enumerate(quarters):
        period_start = filing_dates[q]
        period_end = (filing_dates[quarters[i + 1]]
                      if i < len(quarters) - 1 else today)
        ps = pd.Timestamp(period_start)
        pe = pd.Timestamp(period_end)
        # Trading days in this period
        mask = (close.index >= ps) & (close.index <= pe)
        period_close = close[mask]
        if period_close.empty:
            continue
        # Option prices for this period (keyed by (ticker, type) → prices)
        quarter_opt = per_period_opt.get(q, {}) if per_period_opt else {}
        # Determine starting prices, fixed exposure, and deployed capital.
        positions = holdings[q]
        exposure = {}
        costs = {}
        start_prices = {}
        start_underlying = {}
        use_opt_px = {}  # track which positions use option prices
        total_cost = 0
        for (ticker, pos_type), value in positions.items():
            is_option = pos_type in ('call', 'put')
            opt_key = _option_position_key(ticker, pos_type)
            if mode == 'equity_only':
                if pos_type not in ('long', 'call', 'put'):
                    continue
                if ticker not in close.columns:
                    continue
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
                if avail.empty:
                    continue
                stock_start = float(avail.iloc[0])
                if stock_start <= 0:
                    continue
                start_prices[(ticker, pos_type)] = stock_start
                start_underlying[(ticker, pos_type)] = stock_start
                costs[(ticker, pos_type)] = value
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = False
                total_cost += value
                continue
            # mode == 'full' (option proxy)
            if is_option:
                if opt_key not in quarter_opt:
                    raise RuntimeError(
                        f"No MarketData option prices for {opt_key} in "
                        f"period {q}")
                ticker_opt = quarter_opt[opt_key]
                opt_dates = sorted(d for d in ticker_opt if d >= period_start)
                if not opt_dates:
                    raise RuntimeError(
                        f"MarketData option prices for {opt_key} in period "
                        f"{q} contain no dates at or after {period_start}")
                if ticker not in close.columns:
                    raise RuntimeError(
                        f"No underlying close series for {ticker}")
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
                if avail.empty:
                    raise RuntimeError(
                        f"No underlying price for {ticker} at {period_start}")
                opt_start = ticker_opt[opt_dates[0]]
                underlying_start = float(avail.iloc[0])
                if opt_start <= 0 or underlying_start <= 0:
                    raise RuntimeError(
                        f"Non-positive starting price for {opt_key} in "
                        f"period {q}")
                start_prices[(ticker, pos_type)] = opt_start
                start_underlying[(ticker, pos_type)] = underlying_start
                costs[(ticker, pos_type)] = (value * opt_start
                                             / underlying_start)
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = True
                total_cost += costs[(ticker, pos_type)]
                continue
            # Plain stock in full mode
            if ticker not in close.columns:
                continue
            src = close[ticker].dropna()
            avail = src[src.index >= ps]
            if avail.empty:
                continue
            stock_start = float(avail.iloc[0])
            if stock_start <= 0:
                continue
            start_prices[(ticker, pos_type)] = stock_start
            start_underlying[(ticker, pos_type)] = stock_start
            costs[(ticker, pos_type)] = value
            exposure[(ticker, pos_type)] = value
            use_opt_px[(ticker, pos_type)] = False
            total_cost += value
        if total_cost == 0:
            continue
        # Daily P&L relative to period start. Skip first day of subsequent
        # periods (already recorded as last day of the prior period) to
        # avoid duplicate boundary dates.
        start_idx = 1 if i > 0 else 0
        # Forward-fill: track last known option price so that gaps in
        # option data don't cause positions to vanish mid-period.
        last_opt = {k: v for k, v in start_prices.items()
                    if use_opt_px.get(k)}
        for day_idx in range(start_idx, len(period_close)):
            day = period_close.index[day_idx]
            day_str = day.strftime('%Y-%m-%d')
            period_pnl = 0
            for (ticker, pos_type), value in exposure.items():
                p0 = start_prices[(ticker, pos_type)]
                if p0 == 0:
                    continue
                if use_opt_px[(ticker, pos_type)]:
                    opt_key = _option_position_key(ticker, pos_type)
                    p1_val = quarter_opt.get(opt_key, {}).get(day_str)
                    if p1_val is not None:
                        last_opt[(ticker, pos_type)] = p1_val
                    else:
                        p1_val = last_opt.get((ticker, pos_type))
                    if p1_val is None:
                        continue
                    underlying_p0 = start_underlying.get((ticker, pos_type))
                    if not underlying_p0 or underlying_p0 <= 0:
                        continue
                    position_pnl = (value * (float(p1_val) - p0)
                                    / underlying_p0)
                else:
                    if ticker not in period_close.columns:
                        continue
                    p1_val = period_close[ticker].iloc[day_idx]
                    if pd.isna(p1_val):
                        continue
                    stock_ret = (float(p1_val) - p0) / p0
                    if mode == 'equity_only':
                        position_pnl = (value
                                        * _linear_underlying_sign(pos_type)
                                        * stock_ret)
                    else:
                        position_pnl = value * stock_ret
                period_pnl += position_pnl
            dates_out.append(day)
            values_out.append(cum_growth * (1 + period_pnl / total_cost))
        # Chain: next period starts from the last day's growth factor
        if values_out:
            cum_growth = values_out[-1]
    return dates_out, values_out


import plotly.graph_objects as go

HUGO_BASE = os.path.expanduser('~/My Drive/repos/stafforini.com')

# ── Fetch daily prices ────────────────────────────────────────────
close = download_daily(all_tickers, first_date, today)
dates_eq, vals_eq = daily_cumulative(
    holdings, quarters, filing_dates, close, today, 'equity_only')

# ── Option proxy with representative notional-matched options ───────
option_positions = sorted({
    (t, pt) for q in quarters for (t, pt) in holdings[q]
    if pt in ('call', 'put')
})
per_period_opt = download_option_prices(
    option_positions, quarters, holdings, filing_dates, today)
dates_full, vals_full = daily_cumulative(
    holdings, quarters, filing_dates, close, today, 'full',
    per_period_opt=per_period_opt)

# ── Compute SPY benchmark ─────────────────────────────────────────
spy_series = close['SPY'].dropna()
spy_start = spy_series[spy_series.index >= pd.Timestamp(first_date)]
if not spy_start.empty:
    spy_p0 = float(spy_start.iloc[0])
    spy_dates = spy_start.index.tolist()
    spy_vals = [float(p) / spy_p0 for p in spy_start.values]
else:
    spy_dates, spy_vals = [], []
# ── Plot with Plotly ───────────────────────────────────────────────
eq_pct = [round((v - 1) * 100, 1) for v in vals_eq]
full_pct = [round((v - 1) * 100, 1) for v in vals_full]
spy_pct = [round((v - 1) * 100, 1) for v in spy_vals]
fig = go.Figure()
fig.add_trace(go.Scatter(
    x=dates_eq, y=eq_pct, mode='lines',
    name='Equity proxy',
    line=dict(color='#2563eb', width=2)))
fig.add_trace(go.Scatter(
    x=dates_full, y=full_pct, mode='lines',
    name='Option proxy',
    line=dict(color='#dc2626', width=2)))
fig.add_trace(go.Scatter(
    x=spy_dates, y=spy_pct, mode='lines',
    name='S&P 500 (SPY)',
    line=dict(color='#16a34a', width=2, dash='dot')))
# Vertical lines at filing dates (rebalancing points)
for fd in filing_dates.values():
    fig.add_vline(x=fd, line=dict(color='gray', width=0.5), opacity=0.4)
fig.add_hline(y=0, line=dict(color='gray', width=0.8))
fig.update_layout(
    title=dict(text='SA LP copycat: cumulative returns',
               font=dict(size=15)),
    yaxis=dict(title='Cumulative return', hoverformat='+.1f',
               ticksuffix='%'),
    hovermode='x unified',
    xaxis=dict(spikemode='across', spikethickness=0.5,
               spikedash='solid', spikecolor='gray'),
    template='plotly_white',
    legend=dict(x=0.02, y=0.98, bgcolor='rgba(255,255,255,0.8)'),
    margin=dict(l=60, r=20, t=50, b=40),
    height=500,
)
# ── Generate HTML with dark-mode support ──────────────────────────
import re
chart_html = fig.to_html(full_html=False, include_plotlyjs='cdn',
config={'responsive': True, 'displayModeBar': False})
div_id = re.search(r'id="([^"]+)"', chart_html).group(1)
dark_script = """<script>
(function() {
var gd = document.getElementById('%s');
function isDark() {
try { return parent.document.documentElement.getAttribute('data-theme') === 'dark'; }
catch(e) { return window.matchMedia('(prefers-color-scheme: dark)').matches; }
}
function apply() {
var dk = isDark();
Plotly.relayout(gd, {
paper_bgcolor: 'rgba(0,0,0,0)',
plot_bgcolor: dk ? 'rgba(30,30,30,0.5)' : 'rgba(255,255,255,0.8)',
font: {color: dk ? '#d4d4d4' : '#333'},
'title.font.color': dk ? '#d4d4d4' : '#333',
'xaxis.gridcolor': dk ? 'rgba(255,255,255,0.1)' : 'rgba(0,0,0,0.1)',
'yaxis.gridcolor': dk ? 'rgba(255,255,255,0.1)' : 'rgba(0,0,0,0.1)',
'legend.bgcolor': dk ? 'rgba(30,30,30,0.8)' : 'rgba(255,255,255,0.8)',
'legend.font.color': dk ? '#d4d4d4' : '#333',
});
}
apply();
new MutationObserver(function() { apply(); }).observe(
parent.document.documentElement, {attributes: true, attributeFilter: ['data-theme']});
})();</script>""" % div_id
outpath = os.path.join(HUGO_BASE, 'static', 'images', 'sa-lp-returns.html')
with open(outpath, 'w') as f:
    f.write('<!DOCTYPE html>\n<html>\n<head><meta charset="utf-8">\n'
            '<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate">\n'
            '<style>body { margin: 0; background: transparent; }</style>\n'
            '</head>\n<body>\n' + chart_html + dark_script +
            '\n</body>\n</html>')
```</div></details>
```text
COPYCAT STRATEGY RETURNS
========================================================================
Period Dates Eq. proxy Opt. proxy SPY
------------------------------------------------------------------------
Q4_2024 2025-02-12 to 2025-05-14 -14.73% -14.73% -2.32%
Q1_2025 2025-05-14 to 2025-08-14 +24.14% +35.28% +10.09%
Q2_2025 2025-08-14 to 2025-11-14 +16.45% +22.37% +4.47%
Q3_2025 2025-11-14 to 2026-02-11 +14.54% +19.94% +3.29%
Q4_2025 † 2026-02-11 to 2026-04-15 +25.70% +25.14% +1.43%
------------------------------------------------------------------------
Cumulative 2025-02-12 to 2026-04-15 +77.49% +111.90% +17.69%
† = partial period (still holding; updates on re-evaluation)
Eq. proxy = stocks plus option rows as linear underlying exposure
Opt. proxy = options sized to 13F notional; returns on deployed capital
RISK-ADJUSTED RETURNS
=======================================================
Metric Eq.proxy Opt.proxy SPY
-------------------------------------------------------
Ann. volatility 52.4% 61.5% 19.0%
Sharpe (rf=4%) 1.13 1.30 0.63
Max drawdown -45.8% -45.8% -18.8%
```<iframe id="sa-lp-chart" width="100%" height="520" style="border:none;" scrolling="no"/><script>document.getElementById('sa-lp-chart').src = '/images/sa-lp-returns.html?' + Date.now();</script>
Understanding the two proxies requires a brief detour into how Form 13F reports options. The [SEC's Form 13F FAQ](https://www.sec.gov/rules-regulations/staff-guidance/division-investment-management-frequently-asked-questions/frequently-asked-questions-about-form-13f) clarifies that, for option rows, most columns describe the _underlying security_, not the contract itself. In particular, the reported dollar value is the market value of the underlying shares that the options control, _not_ the premium the fund paid. A filing showing \\(N\\) of INTC calls is therefore roughly \\(N\\) of INTC exposure held through calls, not \\(N\\) of capital spent on call premiums.
The inflation this introduces is easier to see with numbers. Imagine a fund whose 13F reports three positions:
| Row | 13F value | 13F % |
|------------|-----------|-------|
| INTC stock | $100M | 50% |
| INTC calls | $50M | 25% |
| NVDA stock | $50M | 25% |
| Total | $200M | 100% |
The $50M on the calls is underlying notional, not premium. If the representative out-of-the-money 0.15-delta call trades at roughly 5% of spot, the premium actually paid on that position is around $2.5M. Recomputed on deployed capital instead, the picture shifts:
| Row | Capital | Capital % |
|------------|---------|-----------|
| INTC stock | $100M | 65.6% |
| INTC calls | $2.5M | 1.6% |
| NVDA stock | $50M | 32.8% |
| Total | $152.5M | 100% |
The INTC calls look like 25% of the portfolio in the 13F but consume only ~1.6% of the fund's deployed capital. The two modes handle this mismatch differently.
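As a sanity check, the deployed-capital table can be reproduced in a few lines. This is a toy sketch using the hypothetical positions above; the 5% premium ratio is the assumption from the example, not a market quote:

```python
# Hypothetical 13F rows from the example: (reported value, is_option).
rows = {
    'INTC stock': (100e6, False),
    'INTC calls': (50e6, True),   # $50M is underlying notional, not premium
    'NVDA stock': (50e6, False),
}
PREMIUM_RATIO = 0.05  # assumed: a 0.15-delta call trades at ~5% of spot

# Deployed capital: full value for stock rows, premium only for option rows.
deployed = {name: value * (PREMIUM_RATIO if is_opt else 1.0)
            for name, (value, is_opt) in rows.items()}
total = sum(deployed.values())
for name, cap in sorted(deployed.items()):
    print(f'{name}: ${cap / 1e6:.1f}M ({100 * cap / total:.1f}%)')
# INTC calls come out at $2.5M, about 1.6% of $152.5M deployed.
```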
The _equity proxy_ mode is the conservative baseline. It converts every row into linear exposure to the underlying, sized by the reported dollar value: long stock for calls, short stock for puts. The denominator is the sum of reported values. This mode makes no assumption about the missing option details: it simply asks what the disclosed directional bets would have earned if executed as vanilla equities.
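The row-type-to-sign mapping that this mode relies on is trivial to state in code. The helper below is a hypothetical mirror of the `_linear_underlying_sign` used in the backtest, written here only to make the convention explicit:

```python
def linear_underlying_sign(pos_type):
    """+1 for long stock and calls, -1 for puts.

    Hypothetical stand-in for the backtest's _linear_underlying_sign:
    in equity-proxy mode every row becomes signed linear exposure.
    """
    return -1 if pos_type == 'put' else 1
```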
The _option proxy_ mode tries to preserve the option-like payoff shape. A call is not just levered stock and a put is not just a short: options can express a view about tail size, volatility, or downside capped at the premium. A low-delta call, for instance, pays off on a large move while risking only the premium—qualitatively different from a linear long.
Since, as noted above, the filings reveal the underlying but not the actual options contract, the proxy picks a deliberately narrow representative contract for each option row: same type (call or put), expiring 9–15 months out, with absolute delta closest to 0.15. This is not an estimate of the fund's actual strike or expiry. Rather, it is an attempt to preserve the qualitative thesis: out-of-the-money optionality and convex exposure to large moves.
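The selection rule reduces to "filter by expiry window, then minimize distance to the delta target." A toy version, with a made-up three-contract chain (the full implementation above also filters by side, price validity, and vendor fields):

```python
from datetime import date, timedelta

def pick_representative(chain, ref, delta_target=0.15,
                        min_days=270, max_days=456):
    """Keep contracts expiring 9-15 months out; pick |delta| closest
    to the target. Toy sketch of the selection rule."""
    lo, hi = ref + timedelta(days=min_days), ref + timedelta(days=max_days)
    in_window = [c for c in chain if lo <= c['expiry'] <= hi]
    return min(in_window,
               key=lambda c: abs(abs(c['delta']) - delta_target),
               default=None)

chain = [
    {'expiry': date(2026, 1, 16), 'delta': 0.32, 'strike': 30},
    {'expiry': date(2026, 1, 16), 'delta': 0.14, 'strike': 45},
    {'expiry': date(2025, 6, 20), 'delta': 0.15, 'strike': 40},  # too near
]
best = pick_representative(chain, date(2025, 3, 1))  # strike-45 contract
```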
With the contract fixed, the position is sized from the 13F underlying notional. If a filing reports \\(N\\) of INTC underlying notional and INTC starts the period at \\(S\_0\\), the proxy holds approximately \\(N / (100 \cdot S\_0)\\) contracts (one contract covers 100 shares). The contract's daily mid price is then pulled from [MarketData.app](https://www.marketdata.app/) and used to compute the period's option P&amp;L.[^fn:3]
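The sizing rule amounts to a one-liner; the numbers below are illustrative, not taken from any filing:

```python
def contracts_for_notional(notional, spot, multiplier=100):
    """Contracts whose underlying share count matches a 13F notional."""
    return notional / (multiplier * spot)

# $50M of underlying notional with the stock at $25:
n = contracts_for_notional(50e6, 25.0)  # 20,000 contracts
```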
The option proxy divides the period's total P&amp;L by the capital _actually deployed_ (stock market value plus option premium paid), not by the sum of reported 13F values. This is the denominator an investor would see on a brokerage statement: P&amp;L on money genuinely spent. Returning to the three-position example, suppose that over a period INTC stock gains 10% (+$10M), the INTC calls roughly double (+$2.5M), and NVDA gains 20% (+$10M), for a total P&amp;L of $22.5M. On deployed capital ($152.5M) that is a 14.75% return; on the 13F total ($200M) it would be 11.25%. The 11.25% figure incorrectly dilutes the return by treating the $47.5M gap between notional and premium as if it were cash sitting idle.
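The arithmetic in that comparison, spelled out with the illustrative figures from the running example:

```python
pnl = 10e6 + 2.5e6 + 10e6        # INTC stock +10%, calls double, NVDA +20%
deployed = 100e6 + 2.5e6 + 50e6  # premium, not notional, for the calls
reported_13f = 200e6             # sum of 13F values (notional for options)

ret_deployed = pnl / deployed      # ~0.1475, i.e. 14.75%
ret_reported = pnl / reported_13f  # 0.1125, i.e. 11.25%
```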
Sensitivity to contract choice {#sensitivity-to-contract-choice}
The option proxy has two free parameters that control which contract stands in for each disclosed option row: the target absolute delta (0.15 in the main backtest) and the expiry window (9–15 months). Because the fund's actual contracts are undisclosed, neither choice is constrained by evidence. The question is then how much the proxy's reported performance depends on these choices. If the cumulative return swings dramatically across plausible alternatives, the option proxy is really a family of proxies and the headline figure should be read as one point in a wide band. If it barely moves, the default choice is defensible.
The block below reruns the option proxy under (a) five delta targets at the baseline 9–15 month expiry window, and (b) four expiry windows at the baseline |delta| = 0.15. The broad option chain for each (ticker, period) is fetched once and reused across combinations, so, beyond the single-point run, the sweep mostly costs one quote-series fetch per (new contract × ticker × period).<details><summary>Code</summary><div class="details"><a id="code-snippet--sa-sensitivity"/>
```python
import json
import yfinance as yf
import pandas as pd
from datetime import datetime, timedelta
import numpy as np
import requests
import time
import os
import warnings
warnings.filterwarnings('ignore')
# Parse data from the scraper block
parsed = json.loads(data) if isinstance(data, str) else data
filings = parsed["filings"]
# Build internal structures
filing_dates = {f["quarter"]: f["filing_date"] for f in filings}
quarter_end_dates = {f["quarter"]: f["quarter_end"] for f in filings}
quarters = [f["quarter"] for f in filings]
# Convert holdings list to dict keyed by quarter. Multiple positions in the
# same ticker with different types are aggregated by value per
# (ticker, type) pair.
holdings = {}
for f in filings:
    positions = {}
    for h in f["holdings"]:
        ticker = h["ticker"]
        pos_type = h["type"]
        value = h["value"]
        key = (ticker, pos_type)
        positions[key] = positions.get(key, 0) + value
    holdings[f["quarter"]] = positions
def _extract_close_series(df, ticker):
    """Extract a single close-price series from a yfinance result."""
    if df.empty:
        return pd.Series(dtype=float)
    if isinstance(df.columns, pd.MultiIndex):
        if 'Close' not in df.columns.get_level_values(0):
            return pd.Series(dtype=float)
        close = df['Close']
        if isinstance(close, pd.DataFrame):
            if ticker in close.columns:
                series = close[ticker]
            elif len(close.columns) == 1:
                series = close.iloc[:, 0]
            else:
                return pd.Series(dtype=float)
        else:
            series = close
    elif 'Close' in df.columns:
        series = df['Close']
        if isinstance(series, pd.DataFrame):
            series = series.iloc[:, 0]
    else:
        return pd.Series(dtype=float)
    return pd.to_numeric(series, errors='coerce').dropna()

def _download_close_series(ticker, start, end):
    """Download one ticker's close series; used to repair flaky batch misses."""
    df = yf.download(ticker, start=start, end=end, progress=False,
                     auto_adjust=True)
    return _extract_close_series(df, ticker)
def get_prices(tickers, dates):
    """Fetch close prices for tickers on specific dates."""
    unique_tickers = sorted(set(tickers))
    all_dates = [datetime.strptime(d, '%Y-%m-%d') for d in dates]
    start = min(all_dates) - timedelta(days=5)
    end = max(all_dates) + timedelta(days=5)
    df = yf.download(unique_tickers, start=start, end=end, progress=False, auto_adjust=True)
    # yf.download returns MultiIndex columns (metric, ticker) for multiple tickers
    if df.empty:
        close = pd.DataFrame()
    elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = unique_tickers
    else:
        close = pd.DataFrame()
    prices = {}
    for ticker in unique_tickers:
        if ticker in close.columns:
            series = pd.to_numeric(close[ticker], errors='coerce').dropna()
        else:
            series = pd.Series(dtype=float)
        if series.empty:
            series = _download_close_series(ticker, start, end)
        if series.empty:
            continue
        prices[ticker] = {}
        for date_str in dates:
            target = pd.Timestamp(datetime.strptime(date_str, '%Y-%m-%d'))
            after = series[series.index >= target]
            if not after.empty:
                prices[ticker][date_str] = float(after.iloc[0])
            else:
                before = series[series.index <= target]
                if not before.empty:
                    prices[ticker][date_str] = float(before.iloc[-1])
    return prices

def _price_on_or_after(px_by_date, target_date):
    """Return (date, price) for the first available price on/after target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d >= target_date)
    if not dates:
        return None
    d = dates[0]
    return d, px_by_date[d]

def _price_on_or_before(px_by_date, target_date):
    """Return (date, price) for the last available price on/before target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d <= target_date)
    if not dates:
        return None
    d = dates[-1]
    return d, px_by_date[d]

def _period_price_pair(px_by_date, start_date, end_date):
    """Return start/end prices for a period using sensible boundary alignment."""
    start = _price_on_or_after(px_by_date, start_date)
    end = _price_on_or_before(px_by_date, end_date)
    if start is None or end is None:
        return None
    start_actual, p0 = start
    end_actual, p1 = end
    if end_actual < start_actual:
        return None
    return start_actual, end_actual, p0, p1

def _option_position_key(ticker, pos_type):
    return (ticker, pos_type)

def _linear_underlying_sign(pos_type):
    """Direction when option rows are converted to underlying equity exposure."""
    return -1 if pos_type == 'put' else 1

def compute_return(positions, prices, start_date, end_date,
                   mode='equity_only', option_prices=None):
    """Compute portfolio return between two dates.

    The 13F value for an option row is treated as underlying notional, not
    option premium. Option contracts are sized from that notional, but the
    portfolio denominator is estimated deployed capital: stock value plus
    option premium cost. This avoids treating the gap between option
    notional and option premium as cash.

    In 'full' mode, every option row requires a MarketData price series;
    missing data raises rather than falling back.
    """
    total_cost = 0
    portfolio_pnl = 0
    for (ticker, pos_type), value in positions.items():
        is_option = pos_type in ('call', 'put')
        stock_px = prices.get(ticker)
        if mode == 'equity_only':
            if pos_type not in ('long', 'call', 'put'):
                continue
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            total_cost += value
            portfolio_pnl += value * _linear_underlying_sign(pos_type) * stock_ret
            continue
        if is_option:
            opt_key = _option_position_key(ticker, pos_type)
            opt_px = option_prices.get(opt_key) if option_prices else None
            if not opt_px:
                raise RuntimeError(
                    f"No MarketData option prices for {opt_key} in period "
                    f"{start_date}..{end_date}")
            pair = _period_price_pair(opt_px, start_date, end_date)
            if pair is None:
                raise RuntimeError(
                    f"MarketData option price series for {opt_key} does not "
                    f"cover {start_date}..{end_date}")
            start_actual, end_actual, opt_p0, opt_p1 = pair
            stock_start = _price_on_or_after(stock_px, start_actual)
            if stock_start is None or stock_start[1] <= 0:
                stock_start = _price_on_or_after(stock_px, start_date)
            if stock_start is None or stock_start[1] <= 0:
                raise RuntimeError(
                    f"No underlying price for {ticker} at {start_date}")
            p0, p1 = opt_p0, opt_p1
            underlying_p0 = stock_start[1]
            if p0 <= 0 or underlying_p0 <= 0:
                continue
            position_cost = value * (p0 / underlying_p0)
            position_pnl = value * ((p1 - p0) / underlying_p0)
        else:
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            position_cost = value
            position_pnl = value * stock_ret
        if position_cost <= 0:
            continue
        total_cost += position_cost
        portfolio_pnl += position_pnl
    return portfolio_pnl / total_cost if total_cost else None

def annualize(ret, days):
    """Annualize a return over a given number of calendar days."""
    if ret is None or days <= 0:
        return None
    return (1 + ret) ** (365.25 / days) - 1

def fmt(ret):
    return f"{ret * 100:+.2f}%" if ret is not None else "N/A"

# Collect all tickers and dates
all_tickers = set()
for positions in holdings.values():
    for (ticker, _) in positions:
        all_tickers.add(ticker)
all_tickers.add('SPY')
today = datetime.now().strftime('%Y-%m-%d')
first_date = filing_dates[quarters[0]]
all_dates = set(filing_dates.values()) | set(quarter_end_dates.values()) | {today}
prices = get_prices(sorted(all_tickers), sorted(all_dates))

# Resolve `today` to the actual last available closing date. yfinance may not
# have data for today (market still open or holiday), so we look up what date
# SPY's price actually corresponds to.
def _resolve_price_date(prices, requested_date):
    """Return the actual trading date of the price stored under requested_date."""
    ref = 'SPY' if 'SPY' in prices else next(iter(prices), None)
    if not ref or requested_date not in prices[ref]:
        return requested_date
    target_price = prices[ref][requested_date]
    # Re-download a small window to find the real date of this price
    start = datetime.strptime(requested_date, '%Y-%m-%d') - timedelta(days=10)
    end = datetime.strptime(requested_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(ref, start=start, end=end, progress=False, auto_adjust=True)
    if df.empty:
        return requested_date
    if isinstance(df.columns, pd.MultiIndex):
        close = df['Close'][ref].dropna()
    elif 'Close' in df.columns:
        close = df['Close'].dropna()
    else:
        close = df.iloc[:, 0].dropna()
    for dt, px in close.items():
        val = float(px.iloc[0]) if isinstance(px, pd.Series) else float(px)
        if abs(val - target_price) < 0.01:
            ts = dt[0] if isinstance(dt, tuple) else dt
            return pd.Timestamp(ts).strftime('%Y-%m-%d')
    return requested_date

today_resolved = _resolve_price_date(prices, today)
if today_resolved != today:
    for ticker in prices:
        if today in prices[ticker]:
            prices[ticker][today_resolved] = prices[ticker].pop(today)
    today = today_resolved

def download_daily(tickers, start_date, end_date):
    """Download daily close prices from yfinance, handling MultiIndex.

    Dates are 'YYYY-MM-DD' strings. Adds a small buffer for trading-day
    alignment."""
    tickers_sorted = sorted(tickers)
    start = datetime.strptime(start_date, '%Y-%m-%d') - timedelta(days=5)
    end = datetime.strptime(end_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(tickers_sorted, start=start, end=end, progress=False, auto_adjust=True)
    if df.empty:
        close = pd.DataFrame()
    elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = tickers_sorted
    else:
        close = pd.DataFrame()
    for ticker in tickers_sorted:
        if ticker in close.columns and not close[ticker].dropna().empty:
            continue
        series = _download_close_series(ticker, start, end)
        if not series.empty:
            close[ticker] = series
    return close.sort_index()

# -- Historical option prices via MarketData --------------------------------
OPTION_CACHE_DIR = os.path.expanduser('~/My Drive/notes/.sa-lp-option-cache')
_MD_BASE = 'https://api.marketdata.app/v1'
_MD_RATE_DELAY = 0.15
OPTION_CACHE_COLUMNS = [
    'date', 'selected_on', 'option_type', 'symbol', 'strike', 'expiry',
    'delta', 'price']

# Default contract selection parameters. The option proxy picks a contract
# matching option type, with expiry between min_days and max_days of the
# period start, and |delta| closest to delta_target. When the chain is
# sparse, the achieved |delta| may be far from the target; the sensitivity
# block reports achieved |delta| so this is visible rather than silent.
OPTION_DELTA = 0.15
EXPIRY_MIN_DAYS = 270  # ~9 months
EXPIRY_MAX_DAYS = 456  # ~15 months

def _normalize_option_type(option_type):
    option_type = str(option_type).lower()
    if option_type not in ('call', 'put'):
        raise ValueError(f"Unsupported option type: {option_type}")
    return option_type

def _empty_option_cache():
    return pd.DataFrame(columns=OPTION_CACHE_COLUMNS)

def _option_cache_path(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Return the cache CSV path for (ticker, type, delta_target, window).

    When the parameter triple equals the baseline (0.15, 270-456 days), the
    historical filename ``TICKER-TYPE.csv`` is used so the main-backtest
    cache is reused automatically. Any non-baseline combo lives in a
    separate ``TICKER-TYPE-d<delta>-e<min>-<max>.csv`` file so a sensitivity
    sweep never pollutes the baseline cache (which the portfolio calculator
    reads to pick the representative contract for the current filing).
    """
    option_type = _normalize_option_type(option_type)
    is_baseline = (
        abs(delta_target - OPTION_DELTA) < 1e-9
        and min_days == EXPIRY_MIN_DAYS
        and max_days == EXPIRY_MAX_DAYS)
    if is_baseline:
        return os.path.join(OPTION_CACHE_DIR, f'{ticker}-{option_type}.csv')
    return os.path.join(
        OPTION_CACHE_DIR,
        f'{ticker}-{option_type}-d{delta_target:g}-e{min_days}-{max_days}.csv')

def _load_option_cache(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Load cached MarketData rows for a ticker/type/target/window.

    Returns DataFrame or empty."""
    option_type = _normalize_option_type(option_type)
    path = _option_cache_path(ticker, option_type, delta_target, min_days, max_days)
    if not os.path.exists(path):
        return _empty_option_cache()
    df = pd.read_csv(path)
    if df.empty:
        return _empty_option_cache()
    for col in OPTION_CACHE_COLUMNS:
        if col not in df.columns:
            df[col] = np.nan
    for col in ('date', 'selected_on'):
        df[col] = pd.to_datetime(
            df[col], errors='coerce').dt.strftime('%Y-%m-%d')
    df['option_type'] = df['option_type'].fillna(option_type).str.lower()
    cache = df[OPTION_CACHE_COLUMNS].copy()
    cache = cache[cache['option_type'] == option_type].copy()
    cache.dropna(subset=['date'], inplace=True)
    for col in ('strike', 'delta', 'price'):
        cache[col] = pd.to_numeric(cache[col], errors='coerce')
    cache.drop_duplicates(
        subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
        keep='last', inplace=True)
    cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
    return cache[OPTION_CACHE_COLUMNS]

def _save_option_cache(ticker, option_type, df, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Persist typed option cache to CSV."""
    option_type = _normalize_option_type(option_type)
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    path = _option_cache_path(ticker, option_type, delta_target, min_days, max_days)
    if df.empty:
        df = _empty_option_cache()
    else:
        df = df.copy()
        df['option_type'] = option_type
        for col in OPTION_CACHE_COLUMNS:
            if col not in df.columns:
                df[col] = np.nan
        df.drop_duplicates(
            subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
            keep='last', inplace=True)
        df.sort_values(['date', 'expiry', 'strike'], inplace=True)
    df.to_csv(path, index=False)

def _contract_window(ref_date_str, min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    return ref + timedelta(days=min_days), ref + timedelta(days=max_days)

def _contract_from_cache_row(row, ref_date_str, option_type,
                             min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    if str(row.get('option_type', option_type)).lower() != option_type:
        return None
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    try:
        exp = datetime.strptime(str(row['expiry']), '%Y-%m-%d')
    except (KeyError, TypeError, ValueError):
        return None
    if not (lo <= exp <= hi):
        return None
    strike = _safe_float(row.get('strike'))
    delta = _safe_float(row.get('delta'))
    price = _safe_float(row.get('price'))
    if strike is None or delta is None or price is None or price <= 0:
        return None
    return {
        'selected_on': row.get('selected_on'),
        'option_type': option_type,
        'symbol': row.get('symbol'),
        'strike': strike,
        'expiry': str(row['expiry']),
        'delta': delta,
        'price': price,
    }

def _select_cached_contract(cache, option_type, ref_date_str,
                            delta_target=OPTION_DELTA,
                            min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS,
                            require_selected=False):
    rows = cache[(cache['date'] == ref_date_str)
                 & (cache['option_type'] == option_type)]
    selected_rows = rows[rows['selected_on'] == ref_date_str]
    if not selected_rows.empty:
        rows = selected_rows
    elif require_selected:
        rows = selected_rows
    candidates = []
    for _, row in rows.iterrows():
        contract = _contract_from_cache_row(row, ref_date_str, option_type,
                                            min_days, max_days)
        if contract:
            candidates.append(contract)
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def _parse_option_price(contract):
    """Extract a mark price from an option contract record."""
    mid = _safe_float(contract.get('mid'))
    if mid and mid > 0:
        return mid
    bid = _safe_float(contract.get('bid'))
    ask = _safe_float(contract.get('ask'))
    last = _safe_float(contract.get('last'))
    if bid and ask and bid > 0 and ask > 0:
        return (bid + ask) / 2
    if last and last > 0:
        return last
    return None

def _safe_float(val):
    try:
        out = float(val)
        if np.isnan(out):
            return None
        return out
    except (TypeError, ValueError):
        return None
def _marketdata_key():
    """Return the MarketData API key, or None if unavailable.

    Resolution order:

    1. ``MARKETDATA_KEY`` / ``MARKETDATA_API_KEY`` environment variables.
    2. ``pass env/marketdata-token`` (local ``pass`` store).

    The result is memoised on the function object so repeated lookups
    during a sweep do not reshell. Fetch helpers raise themselves when
    called without a key, so a fully cached run still succeeds without
    requiring either source.
    """
    if hasattr(_marketdata_key, '_cached'):
        return _marketdata_key._cached
    key = (os.environ.get('MARKETDATA_KEY', '')
           or os.environ.get('MARKETDATA_API_KEY', ''))
    if not key:
        try:
            import subprocess
            out = subprocess.run(
                ['pass', 'show', 'env/marketdata-token'],
                capture_output=True, text=True, timeout=5, check=False)
            if out.returncode == 0:
                key = out.stdout.strip().splitlines()[0] if out.stdout else ''
        except (FileNotFoundError, subprocess.TimeoutExpired):
            key = ''
    _marketdata_key._cached = key or None
    return _marketdata_key._cached

def _marketdata_get(path, params, api_key):
    """Fetch a MarketData endpoint, returning normalized row dictionaries.

    Raises on HTTP errors or a non-'ok' status. 'no_data' is returned as
    an empty list so that callers can distinguish 'nothing available' from
    'request failed'.
    """
    headers = {'Accept': 'application/json', 'Authorization': f'Bearer {api_key}'}
    resp = requests.get(_MD_BASE + path, params=params, headers=headers,
                        timeout=30)
    resp.raise_for_status()
    body = resp.json()
    status = body.get('s')
    if status == 'no_data':
        return []
    if status != 'ok':
        raise RuntimeError(
            f"MarketData {path} returned status={status!r}: "
            f"{body.get('errmsg') or body}")
    lengths = [len(v) for v in body.values() if isinstance(v, list)]
    n = max(lengths) if lengths else 0
    rows = []
    for i in range(n):
        row = {}
        for key, val in body.items():
            if isinstance(val, list):
                row[key] = val[i] if i < len(val) else None
            else:
                row[key] = val
        rows.append(row)
    return rows

def _marketdata_date(timestamp):
    try:
        return datetime.utcfromtimestamp(int(timestamp)).strftime('%Y-%m-%d')
    except (TypeError, ValueError, OSError):
        return None

def _occ_symbol(ticker, option_type, strike, expiry):
    """Build a standard OCC option symbol from contract fields."""
    cp = 'C' if _normalize_option_type(option_type) == 'call' else 'P'
    exp = datetime.strptime(str(expiry), '%Y-%m-%d').strftime('%y%m%d')
    strike_int = int(round(float(strike) * 1000))
    root = ticker.upper().replace('.', '')
    return f'{root}{exp}{cp}{strike_int:08d}'

# Chains are always fetched with a broad expiry window so they can be cached
# and reused for in-memory selection across any (delta_target, expiry window)
# combination in the sensitivity sweep.
CHAIN_FETCH_MIN_DAYS = 30
CHAIN_FETCH_MAX_DAYS = 760

def _fetch_marketdata_chain(ticker, date_str, option_type, api_key,
                            min_days=CHAIN_FETCH_MIN_DAYS,
                            max_days=CHAIN_FETCH_MAX_DAYS):
    lo, hi = _contract_window(date_str, min_days, max_days)
    params = {
        'date': date_str,
        'from': lo.strftime('%Y-%m-%d'),
        'to': hi.strftime('%Y-%m-%d'),
        'side': _normalize_option_type(option_type),
        'expiration': 'all',
    }
    return _marketdata_get(f'/options/chain/{ticker}/', params, api_key)

# Chain cache: one CSV per (ticker, type, date) storing the broad-window
# chain. Lets the sensitivity sweep re-select contracts for different delta
# targets and expiry windows without refetching.
CHAIN_CACHE_DIR = os.path.join(OPTION_CACHE_DIR, 'chains')

def _chain_cache_path(ticker, option_type, date_str):
    option_type = _normalize_option_type(option_type)
    return os.path.join(CHAIN_CACHE_DIR, f'{ticker}-{option_type}-{date_str}.csv')

def _load_chain_cache(ticker, option_type, date_str):
    path = _chain_cache_path(ticker, option_type, date_str)
    if not os.path.exists(path):
        return None
    df = pd.read_csv(path)
    if df.empty:
        return []
    return df.to_dict('records')

def _save_chain_cache(ticker, option_type, date_str, chain):
    if not chain:
        return
    os.makedirs(CHAIN_CACHE_DIR, exist_ok=True)
    path = _chain_cache_path(ticker, option_type, date_str)
    pd.DataFrame(chain).to_csv(path, index=False)

def _get_or_fetch_chain(ticker, date_str, option_type, api_key, fetched_counter=None):
    """Return the cached broad chain for (ticker, type, date), fetching if absent.

    Requires ``api_key`` only when a fetch is actually needed.
    """
    chain = _load_chain_cache(ticker, option_type, date_str)
    if chain is not None:
        return chain
    if not api_key:
        raise RuntimeError(
            "MARKETDATA_KEY is not set but a chain fetch is required for "
            f"{ticker} {option_type} on {date_str}.")
    time.sleep(_MD_RATE_DELAY)
    chain = _fetch_marketdata_chain(ticker, date_str, option_type, api_key)
    if fetched_counter is not None:
        fetched_counter['marketdata_chains'] += 1
    _save_chain_cache(ticker, option_type, date_str, chain)
    return chain

def _fetch_marketdata_quotes(symbol, start_date, end_date, api_key):
    to_date = (datetime.strptime(end_date, '%Y-%m-%d')
               + timedelta(days=1)).strftime('%Y-%m-%d')
    rows = _marketdata_get(f'/options/quotes/{symbol}/',
                           {'from': start_date, 'to': to_date}, api_key)
    prices = {}
    for row in rows:
        date_str = _marketdata_date(row.get('updated'))
        if not date_str:
            continue
        price = _parse_option_price(row)
        if price is not None and price > 0:
            prices[date_str] = price
    return prices
def _implied_vol_from_price(S, K, T, option_price, option_type):
    """Infer Black-Scholes volatility from an observed option mid price."""
    if any(x is None for x in (S, K, T, option_price)):
        return None
    if S <= 0 or K <= 0 or T <= 0 or option_price <= 0:
        return None
    intrinsic = max(S - K, 0) if option_type == 'call' else max(K - S, 0)
    upper = S if option_type == 'call' else K
    if option_price < intrinsic - 1e-6 or option_price > upper * 1.5:
        return None
    lo, hi = 1e-4, 5.0
    try:
        if (option_price < bs_price(S, K, T, lo, option_type) - 1e-4
                or option_price > bs_price(S, K, T, hi, option_type) + 1e-4):
            return None
        for _ in range(80):
            mid = (lo + hi) / 2
            if bs_price(S, K, T, mid, option_type) < option_price:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    except (FloatingPointError, ValueError, ZeroDivisionError):
        return None

def _marketdata_delta(row, ref_date_str, expiry, option_type, price):
    """Use vendor delta when present; otherwise infer it from the quote."""
    native = _safe_float(row.get('delta'))
    if native is not None and native != 0:
        return native
    S = _safe_float(row.get('underlyingPrice'))
    K = _safe_float(row.get('strike'))
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    exp = datetime.strptime(expiry, '%Y-%m-%d')
    T = max((exp - ref).days / 365.25, 1e-6)
    sigma = _safe_float(row.get('iv'))
    if sigma is None or sigma <= 0:
        sigma = _implied_vol_from_price(S, K, T, price, option_type)
    if S is None or K is None or sigma is None or sigma <= 0:
        return None
    return bs_delta(S, K, T, sigma, option_type)

def _select_marketdata_contract(chain, ref_date_str, option_type,
                                delta_target=OPTION_DELTA,
                                min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    candidates = []
    for c in chain:
        if str(c.get('side', '')).lower() != option_type:
            continue
        expiry = _marketdata_date(c.get('expiration'))
        if not expiry:
            continue
        exp = datetime.strptime(expiry, '%Y-%m-%d')
        if not (lo <= exp <= hi):
            continue
        price = _parse_option_price(c)
        if price is None or price <= 0:
            continue
        delta = _marketdata_delta(c, ref_date_str, expiry, option_type, price)
        if delta is None or delta == 0:
            continue
        strike = _safe_float(c.get('strike'))
        symbol = c.get('optionSymbol')
        if strike is None or not symbol:
            continue
        candidates.append({
            'option_type': option_type,
            'symbol': symbol,
            'strike': strike,
            'expiry': expiry,
            'delta': delta,
            'price': price,
        })
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def download_option_prices(option_positions, quarters, holdings, filing_dates,
                           today, delta_target=OPTION_DELTA,
                           min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Download historical representative option prices from MarketData.

    MarketData is the sole supported provider. MARKETDATA_KEY must be set.

    For each (ticker, option_type) and each filing period in which that
    position is held:

    1. On the first trading day, select a contract matching type, with
       expiry between ``min_days`` and ``max_days`` of the period start, and
       |delta| closest to ``delta_target``. MarketData's Starter plan often
       returns null Greeks, so delta is inferred from the observed mid price
       via Black-Scholes when the vendor delta is missing.
    2. Lock in that contract for the period.
    3. Track its historical mid price through the period.

    The broad option chain for each (ticker, type, first_day) is cached to
    disk so that sensitivity sweeps over (delta_target, expiry window) reuse
    a single fetch.

    Raises ``RuntimeError`` if no suitable contract can be selected for any
    required (ticker, type, period), or if MarketData returns no price
    series for the selected contract.

    Parameters
    ----------
    delta_target : float
        Target |delta| for contract selection (default ``OPTION_DELTA``).
    min_days, max_days : int
        Contract expiry window in days from period start (default 270-456,
        i.e. 9-15 months).

    Returns
    -------
    per_period : dict {quarter_str: {(ticker, type): {date_str: float}}}
        Option prices keyed by filing period then option position. Each
        period has its own contract's prices.
    """
    option_positions = sorted({
        (ticker, _normalize_option_type(pos_type))
        for ticker, pos_type in option_positions})
    md_key = _marketdata_key()
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    per_period = {}  # {q: {(ticker, type): {date_str: price}}}
    fetched = {'marketdata_chains': 0, 'marketdata_quotes': 0}
    for ticker, option_type in option_positions:
        opt_key = _option_position_key(ticker, option_type)
        cache = _load_option_cache(ticker, option_type, delta_target,
                                   min_days, max_days)
        new_rows = []
        for i, q in enumerate(quarters):
            # Skip quarters where this exact option position is absent.
            if opt_key not in holdings[q]:
                continue
            period_start = filing_dates[q]
            period_end = (filing_dates[quarters[i + 1]]
                          if i < len(quarters) - 1 else today)
            trading_days = pd.bdate_range(period_start, period_end)
            if len(trading_days) == 0:
                continue
            first_day = trading_days[0].strftime('%Y-%m-%d')
            # -- Select contract on first trading day --
            contract = _select_cached_contract(
                cache, option_type, first_day,
                delta_target=delta_target,
                min_days=min_days, max_days=max_days,
                require_selected=True)
            if contract is None:
                chain = _get_or_fetch_chain(
                    ticker, first_day, option_type, md_key, fetched)
                contract = _select_marketdata_contract(
                    chain, first_day, option_type,
                    delta_target=delta_target,
                    min_days=min_days, max_days=max_days)
                if contract is None:
                    raise RuntimeError(
                        f"MarketData returned no usable {option_type} contract "
                        f"for {ticker} on {first_day} (period {q}) at "
                        f"delta={delta_target}, "
                        f"expiry {min_days}-{max_days}d")
                new_rows.append({
                    'date': first_day,
                    'selected_on': first_day,
                    'option_type': option_type,
                    'symbol': contract.get('symbol'),
                    'strike': contract['strike'],
                    'expiry': contract['expiry'],
                    'delta': contract['delta'],
                    'price': contract['price'],
                })
            strike = contract['strike']
            expiry = contract['expiry']
            symbol = contract.get('symbol') or _occ_symbol(
                ticker, option_type, strike, expiry)
            # -- Collect prices for this period (fresh dict per period) --
            period_prices = {}
            # Fast path: read matching prices from cache.
            rows = cache[
                (cache['date'] >= period_start)
                & (cache['date'] <= period_end)
                & (cache['option_type'] == option_type)
                & (abs(cache['strike'] - strike) < 0.01)
                & (cache['expiry'].astype(str) == str(expiry))
                & pd.notna(cache['price'])]
            selected_rows = rows[rows['selected_on'] == first_day]
            if not selected_rows.empty:
                rows = selected_rows
            for _, row in rows.iterrows():
                period_prices[row['date']] = float(row['price'])
            # Decide whether to refresh quotes. With a key, refresh whenever
            # the cached series does not reach period_end. Without a key,
            # only fail if the cached series is empty; a slightly stale tail
            # is acceptable for cache-only runs (e.g. sensitivity sweeps
            # replaying the baseline contract).
            has_partial = bool(period_prices)
            reaches_end = has_partial and max(period_prices) >= period_end
            if md_key and not reaches_end:
                time.sleep(_MD_RATE_DELAY)
                quote_prices = _fetch_marketdata_quotes(
                    symbol, period_start, period_end, md_key)
                fetched['marketdata_quotes'] += 1
                for day_str, price in quote_prices.items():
                    if period_start <= day_str <= period_end:
                        period_prices[day_str] = price
                        new_rows.append({
                            'date': day_str,
                            'selected_on': first_day,
                            'option_type': option_type,
                            'symbol': symbol,
                            'strike': strike,
                            'expiry': expiry,
                            'delta': contract['delta'],
                            'price': price,
                        })
                if contract.get('price') and first_day not in period_prices:
                    period_prices[first_day] = contract['price']
            elif not md_key and not has_partial:
                raise RuntimeError(
                    "MARKETDATA_KEY is not set and no cached quotes exist "
                    f"for {symbol} in {period_start}..{period_end}.")
            if not period_prices:
                raise RuntimeError(
                    f"MarketData returned no quotes for {symbol} "
                    f"({opt_key}) in {period_start}..{period_end}")
            per_period.setdefault(q, {})[opt_key] = period_prices
        # Persist new data to cache
        if new_rows:
            new_df = pd.DataFrame(new_rows)
            cache = pd.concat([cache, new_df], ignore_index=True)
            cache.drop_duplicates(
                subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
                keep='last', inplace=True)
            cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
            _save_option_cache(ticker, option_type, cache,
                               delta_target, min_days, max_days)
    if any(fetched.values()):
        import sys
        parts = []
        if fetched['marketdata_chains']:
            parts.append(f"{fetched['marketdata_chains']} MarketData chains")
        if fetched['marketdata_quotes']:
            parts.append(f"{fetched['marketdata_quotes']} MarketData quote series")
        print(f"[options] Fetched {', '.join(parts)}", file=sys.stderr)
    return per_period

# -- Black-Scholes helpers (used only to infer delta when MarketData's
# Starter-plan historical Greeks are null; never to reprice returns) -----
from scipy.stats import norm as _norm

def bs_price(S, K, T, sigma, option_type='call'):
    """Black-Scholes option price (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return max(S - K, 0)
        return max(K - S, 0)
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if option_type == 'call':
        return S * _norm.cdf(d1) - K * _norm.cdf(d2)
    return K * _norm.cdf(-d2) - S * _norm.cdf(-d1)

def bs_delta(S, K, T, sigma, option_type='call'):
    """Black-Scholes delta (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return 1.0 if S > K else 0.0
        return -1.0 if S < K else 0.0
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    if option_type == 'call':
        return _norm.cdf(d1)
    return _norm.cdf(d1) - 1

def daily_cumulative(holdings, quarters, filing_dates, close, today, mode,
                     per_period_opt=None):
    """Build a daily series of cumulative growth factors for a given mode.

    For each filing period, stock shares and option contracts are fixed. In
    equity-proxy mode, option rows are converted to linear underlying
    exposure: calls are long underlying and puts are short underlying. In
    option-proxy mode, option rows are sized by 13F underlying notional and
    returns come from MarketData quotes; returns are divided by deployed
    capital (stock value plus option premium cost). Option-proxy mode raises
    if MarketData prices are missing for any required position.
    """
    cum_growth = 1.0
    dates_out = []
    values_out = []
    for i, q in enumerate(quarters):
        period_start = filing_dates[q]
        period_end = filing_dates[quarters[i + 1]] if i < len(quarters) - 1 else today
        ps = pd.Timestamp(period_start)
        pe = pd.Timestamp(period_end)
        # Trading days in this period
        mask = (close.index >= ps) & (close.index <= pe)
        period_close = close[mask]
        if period_close.empty:
            continue
        # Option prices for this period (keyed by (ticker, type) → prices)
        quarter_opt = per_period_opt.get(q, {}) if per_period_opt else {}
        # Determine starting prices, fixed exposure, and deployed capital.
        positions = holdings[q]
        exposure = {}
        costs = {}
        start_prices = {}
        start_underlying = {}
        use_opt_px = {}  # track which positions use option prices
        total_cost = 0
        for (ticker, pos_type), value in positions.items():
            is_option = pos_type in ('call', 'put')
            opt_key = _option_position_key(ticker, pos_type)
            if mode == 'equity_only':
                if pos_type not in ('long', 'call', 'put'):
                    continue
                if ticker not in close.columns:
                    continue
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
if avail.empty:
continue
stock_start = float(avail.iloc[0])
if stock_start<= 0:= continue= start_prices[(ticker,= pos_type)]=stock_start start_underlying[(ticker,= pos_type)]=stock_start costs[(ticker,= pos_type)]=value exposure[(ticker,= pos_type)]=value use_opt_px[(ticker,= pos_type)]=False total_cost= +=value continue= mode== 'full'= (option= proxy)= if= is_option:= if= opt_key= not= in= quarter_opt:= raise= RuntimeError(= f"No= MarketData= option= prices= for= {opt_key}= in= "= f"period= {q}")= ticker_opt=quarter_opt[opt_key] opt_dates=sorted(d for= d= in= ticker_opt= if= d=>= period_start)
if not opt_dates:
raise RuntimeError(
f"MarketData option prices for {opt_key} in period "
f"{q} contain no dates at or after {period_start}")
if ticker not in close.columns:
raise RuntimeError(
f"No underlying close series for {ticker}")
src = close[ticker].dropna()
avail = src[src.index >= ps]
if avail.empty:
raise RuntimeError(
f"No underlying price for {ticker} at {period_start}")
opt_start = ticker_opt[opt_dates[0]]
underlying_start = float(avail.iloc[0])
if opt_start<= 0= or= underlying_start= <=0: raise= RuntimeError(= f"Non-positive= starting= price= for= {opt_key}= in= "= f"period= {q}")= start_prices[(ticker,= pos_type)]=opt_start start_underlying[(ticker,= pos_type)]=underlying_start costs[(ticker,= pos_type)]=value *= opt_start= /= underlying_start= exposure[(ticker,= pos_type)]=value use_opt_px[(ticker,= pos_type)]=True total_cost= +=costs[(ticker, pos_type)]= continue= Plain= stock= in= full= mode= if= ticker= not= in= close.columns:= continue= src=close[ticker].dropna() avail=src[src.index>= ps]
if avail.empty:
continue
stock_start = float(avail.iloc[0])
if stock_start<= 0:= continue= start_prices[(ticker,= pos_type)]=stock_start start_underlying[(ticker,= pos_type)]=stock_start costs[(ticker,= pos_type)]=value exposure[(ticker,= pos_type)]=value use_opt_px[(ticker,= pos_type)]=False total_cost= +=value if= total_cost== 0:= continue= Daily= P&L= relative= to= period= start.= Skip= first= day= of= subsequent= periods= (already= recorded= as= last= day= of= the= prior= period)= to= avoid= duplicate= boundary= dates.= start_idx=1 if= i=> 0 else 0
Forward-fill: track last known option price so that gaps in
option data don't cause positions to vanish mid-period.
last_opt = {k: v for k, v in start_prices.items()
if use_opt_px.get(k)}
for day_idx in range(start_idx, len(period_close)):
day = period_close.index[day_idx]
day_str = day.strftime('%Y-%m-%d')
period_pnl = 0
for (ticker, pos_type), value in exposure.items():
p0 = start_prices[(ticker, pos_type)]
if p0 == 0:
continue
if use_opt_px[(ticker, pos_type)]:
opt_key = _option_position_key(ticker, pos_type)
p1_val = quarter_opt.get(opt_key, {}).get(day_str)
if p1_val is not None:
last_opt[(ticker, pos_type)] = p1_val
else:
p1_val = last_opt.get((ticker, pos_type))
if p1_val is None:
continue
underlying_p0 = start_underlying.get((ticker, pos_type))
if not underlying_p0 or underlying_p0<= 0:= continue= position_pnl=value *= (float(p1_val)= -= p0)= /= underlying_p0= else:= if= ticker= not= in= period_close.columns:= continue= p1_val=period_close[ticker].iloc[day_idx] if= pd.isna(p1_val):= continue= stock_ret=(float(p1_val) -= p0)= /= p0= if= mode== 'equity_only':= position_pnl=( value= *= _linear_underlying_sign(pos_type)= *= stock_ret)= else:= position_pnl=value *= stock_ret= period_pnl= +=position_pnl dates_out.append(day)= values_out.append(cum_growth= *= (1= += period_pnl= /= total_cost))= Chain:= next= period= starts= from= the= last= day's= growth= factor= if= values_out:= cum_growth=values_out[-1] return= dates_out,= values_out= --= Setup= (shared= across= combos)= ------------------------------------= option_positions=sorted({ (t,= pt)= for= q= in= quarters= for= (t,= pt)= in= holdings[q]= if= pt= in= ('call',= 'put')= })= daily_close=download_daily(all_tickers, first_date,= today)= def= _sharpe(daily_rets,= rf_annual=0.04): if= daily_rets.empty:= return= float('nan')= rf_daily=(1 += rf_annual)= **= (1= /= 252)= -= 1= excess=daily_rets -= rf_daily= if= excess.std()== 0= or= pd.isna(excess.std()):= return= float('nan')= return= float(excess.mean()= /= excess.std()= *= 252= **= 0.5)= def= _max_drawdown(daily_rets):= if= daily_rets.empty:= return= float('nan')= cum=(1 += daily_rets).cumprod()= return= float(((cum= -= cum.cummax())= /= cum.cummax()).min()= *= 100)= def= _mean_achieved_delta(delta_target,= min_days,= max_days):= """Mean= |delta|= of= contracts= selected= under= a= given= (target,= window),= across= all= (ticker,= type,= period)= positions= in= that= sweep's= cache."""= deltas=[] for= q= in= quarters:= for= (ticker,= pos_type)= in= holdings[q]:= if= pos_type= not= in= ('call',= 'put'):= continue= cache=_load_option_cache(ticker, pos_type,= delta_target,= min_days,= max_days)= if= cache.empty:= continue= rows=cache[(cache['selected_on'] == filing_dates[q])= &= (cache['option_type']== 
pos_type)]= if= rows.empty:= continue= d=rows.iloc[0]['delta'] if= pd.notna(d):= deltas.append(abs(float(d)))= if= not= deltas:= return= float('nan')= return= sum(deltas)= /= len(deltas)= def= backtest_combo(delta_target,= min_days,= max_days):= """Run= the= option-proxy= backtest= under= one= contract-selection= rule."""= per_period_opt=download_option_prices( option_positions,= quarters,= holdings,= filing_dates,= today,= delta_target=delta_target, min_days=min_days, max_days=max_days) cum=1.0 for= i,= q= in= enumerate(quarters):= start=filing_dates[q] end=filing_dates[quarters[i += 1]]= if= i= <= len(quarters)= -= 1= else= today= ret=compute_return(holdings[q], prices,= start,= end,= 'full',= option_prices=per_period_opt.get(q, {}))= if= ret= is= not= None:= cum= *=(1 += ret)= achieved=_mean_achieved_delta(delta_target, min_days,= max_days)= if= daily_close.empty:= return= {'cum_ret':= cum= -= 1,= 'vol':= float('nan'),= 'sharpe':= float('nan'),= 'max_dd':= float('nan'),= 'achieved':= achieved}= dates,= values=daily_cumulative( holdings,= quarters,= filing_dates,= daily_close,= today,= 'full',= per_period_opt=per_period_opt) if= not= dates:= return= {'cum_ret':= cum= -= 1,= 'vol':= float('nan'),= 'sharpe':= float('nan'),= 'max_dd':= float('nan'),= 'achieved':= achieved}= growth=pd.Series(values, index=dates) daily_rets=growth.pct_change().dropna() return= {= 'cum_ret':= cum= -= 1,= 'vol':= float(daily_rets.std()= *= 252= **= 0.5= *= 100),= 'sharpe':= _sharpe(daily_rets),= 'max_dd':= _max_drawdown(daily_rets),= 'achieved':= achieved,= }= def= _safe_combo(label,= delta_target,= min_days,= max_days):= try:= return= backtest_combo(delta_target,= min_days,= max_days)= except= RuntimeError= as= e:= import= sys= print(f"[{label}]= {e}",= file=sys.stderr) return= {'cum_ret':= None,= 'vol':= float('nan'),= 'sharpe':= float('nan'),= 'max_dd':= float('nan'),= 'achieved':= float('nan'),= 'error':= str(e)}= def= _print_row(label,= r,= baseline):= flag=" *" if= baseline= else= 
"= "= cum=fmt(r['cum_ret']) if= r['cum_ret']= is= not= None= else= "err"= vol=f"{r['vol']:>8.1f}%" if r['vol'] == r['vol'] else " N/A"
sh = f"{r['sharpe']:>9.2f}" if r['sharpe'] == r['sharpe'] else " N/A"
mdd = f"{r['max_dd']:>8.1f}%" if r['max_dd'] == r['max_dd'] else " N/A"
ach = f"{r['achieved']:>7.2f}" if r.get('achieved') == r.get('achieved') else " N/A"
print(f"{label:<10}{flag}{cum:>10} {vol:>10} {sh:>10} {mdd:>10} {ach:>8}")
# -- Delta sweep (9-15 month expiry) ---------------------------------
DELTAS = [0.10, 0.15, 0.25, 0.40, 0.50]
BASELINE_DELTA = OPTION_DELTA
BASELINE_EXPIRY = (EXPIRY_MIN_DAYS, EXPIRY_MAX_DAYS)
print("SENSITIVITY TO |DELTA| (expiry 9-15m)")
print("=" * 72)
print(f"{'|Delta|':<10} {'Cum ret':>10} {'Ann vol':>10} "
      f"{'Sharpe':>10} {'Max DD':>10} {'Ach |d|':>8}")
print("-" * 72)
delta_results = {}
for d in DELTAS:
    r = _safe_combo(f"{d:.2f}", d,
                    BASELINE_EXPIRY[0], BASELINE_EXPIRY[1])
    delta_results[d] = r
    _print_row(f"{d:.2f}", r, baseline=(d == BASELINE_DELTA))
cum_rets = [r['cum_ret'] for r in delta_results.values()
            if r['cum_ret'] is not None]
print("-" * 72)
if cum_rets:
    spread = max(cum_rets) - min(cum_rets)
    print(f"{'Spread':<10} {fmt(spread):>10} "
          f"(range across delta choices)")

# -- Expiry sweep (|delta| = 0.15) -----------------------------------
EXPIRIES = [
    ('3-6m', 90, 180),
    ('6-12m', 180, 365),
    ('9-15m', 270, 456),
    ('12-24m', 365, 730),
]
print()
print(f"SENSITIVITY TO EXPIRY WINDOW (|delta| = {BASELINE_DELTA:.2f})")
print("=" * 72)
print(f"{'Expiry':<10} {'Cum ret':>10} {'Ann vol':>10} "
      f"{'Sharpe':>10} {'Max DD':>10} {'Ach |d|':>8}")
print("-" * 72)
expiry_results = {}
for label, mn, mx in EXPIRIES:
    r = _safe_combo(label, BASELINE_DELTA, mn, mx)
    expiry_results[label] = r
    _print_row(label, r,
               baseline=((mn, mx) == BASELINE_EXPIRY))
cum_rets = [r['cum_ret'] for r in expiry_results.values()
            if r['cum_ret'] is not None]
print("-" * 72)
if cum_rets:
    spread = max(cum_rets) - min(cum_rets)
    print(f"{'Spread':<10} {fmt(spread):>10} "
          f"(range across expiry choices)")
print()
print("* = parameter combination used in the main backtest")
print("Ach |d| = mean |delta| of contracts actually selected; differs from "
      "target when the chain is sparse")
```</div></details>
```text
SENSITIVITY TO |DELTA| (expiry 9-15m)
========================================================================
|Delta|       Cum ret    Ann vol     Sharpe     Max DD  Ach |d|
------------------------------------------------------------------------
0.10          +114.29%      61.8%       1.30     -45.8%     0.18
0.15       *  +115.17%      61.4%       1.31     -45.8%     0.16
0.25          +119.62%      61.2%       1.35     -45.8%     0.28
0.40          +123.35%      61.0%       1.37     -45.8%     0.41
0.50          +123.60%      60.6%       1.38     -45.8%     0.50
------------------------------------------------------------------------
Spread         +9.32% (range across delta choices)

SENSITIVITY TO EXPIRY WINDOW (|delta| = 0.15)
========================================================================
Expiry        Cum ret    Ann vol     Sharpe     Max DD  Ach |d|
------------------------------------------------------------------------
3-6m          +103.95%      62.1%       1.23     -45.8%     0.16
6-12m         +113.10%      61.6%       1.30     -45.8%     0.16
9-15m      *  +115.17%      61.4%       1.31     -45.8%     0.16
12-24m        +117.97%      61.7%       1.33     -45.8%     0.16
------------------------------------------------------------------------
Spread        +14.02% (range across expiry choices)

* = parameter combination used in the main backtest
Ach |d| = mean |delta| of contracts actually selected; differs from target when the chain is sparse
```
Three things are worth reading off the tables. First, the _spread_ row: the difference between the highest and lowest cumulative return across the grid. A large spread means the headline option-proxy number is not a point estimate but the centre of a wide band of plausible reconstructions. Second, the _achieved |delta|_ column: when the achieved mean is close to the target, the chain was dense enough to honour the requested selection; when it drifts (often upward, toward ATM), the chains were sparse and the proxy was forced closer to linear stock exposure than the sweep nominally requested. Third, the direction of the spread. For call-heavy books, deeper-OTM targets (lower |delta|) amplify both return and volatility on large upside moves, because low-delta calls are more convex; near-ATM targets behave more like the equity proxy, because higher-delta calls are closer to linear stock. Expiry effects are usually smaller but non-zero: shorter-dated contracts decay more within each ~90-day holding period, while longer-dated contracts retain more time value across rolls.
If the delta spread is a large fraction of the headline return, the option proxy should be read as "a specific, conservative reading of what the disclosed options could have been worth" rather than as a single best estimate of the fund's realised performance.
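The convexity point can be checked directly with a zero-rate Black-Scholes pricer like the one used above for delta inference. All the numbers below (spot, strikes, volatility, move size) are illustrative assumptions, not fund parameters; the sketch just shows that a deep-OTM call's premium is multiplied by more than a near-ATM call's on the same large underlying move.

```python
import math

def norm_cdf(x):
    # Standard normal CDF via erf; avoids a scipy dependency.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, sigma):
    # Black-Scholes call price, zero risk-free rate and dividends.
    d1 = (math.log(S / K) + 0.5 * sigma ** 2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * norm_cdf(d2)

# Hypothetical scenario: spot 100, vol 50%, one year to expiry,
# underlying up 30% over one quarter of elapsed time.
S0, T, sigma, move, dt = 100.0, 1.0, 0.5, 1.3, 0.25
for K in (100.0, 180.0):  # near-ATM strike vs deep-OTM strike
    mult = bs_call(S0 * move, K, T - dt, sigma) / bs_call(S0, K, T, sigma)
    print(f"K={K:.0f}: premium multiplied by {mult:.2f}")
```

The OTM multiple exceeds the ATM multiple, which is why lower |delta| targets push both the cumulative return and the volatility of the proxy upward in the delta sweep.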
Copycat delays {#copycat-delays}
Between one filing and the next (~90 days), the copycat holds a fixed portfolio while the fund’s actual portfolio evolves continuously. We only observe the fund’s positions at quarter-end snapshots; its actual holdings between snapshots are unknown. Furthermore, these snapshots are not published immediately, but after 45 days or so. These two delays—between the fund’s quarterly rebalance and quarter-end, and between quarter-end and filing date—create a gap where the copycat’s holdings are stale compared to the fund’s actual positions.
Let \\(Q\_i\\) denote the fund’s disclosed portfolio at the end of quarter \\(i\\). To estimate the cost of the first delay, we can model the fund as switching from \\(Q\_{i-1}\\) to \\(Q\_i\\) at a single (unknown) point, uniformly distributed over the trading days of quarter \\(i\\).[^fn:4] Let \\(R(P, s, t)\\) denote the return of portfolio \\(P\\) from date \\(s\\) to date \\(t\\), and let \\(T\_i\\) denote the last day of quarter \\(i\\). For each possible switch day \\(d\\), the delay cost is \\(R(Q\_i, d, T\_i) - R(Q\_{i-1}, d, T\_i)\\): the return the fund earned on its new positions that the copycat missed. Averaging over all \\(d\\) gives the expected intra-quarter delay cost.
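As a minimal sketch, the uniform-switch estimate can be computed from daily return series for the two portfolios over quarter \\(i\\); the function name and inputs here are hypothetical, not part of the backtest code:

```python
import numpy as np

def expected_delay_cost(r_old, r_new):
    """Expected intra-quarter delay cost under a uniform switch day.

    r_old, r_new: daily returns of Q_{i-1} and Q_i over the trading days
    of quarter i (illustrative inputs). For each candidate switch day d,
    the cost is R(Q_i, d, T_i) - R(Q_{i-1}, d, T_i); the estimate is the
    mean over all d.
    """
    r_old = np.asarray(r_old, dtype=float)
    r_new = np.asarray(r_new, dtype=float)
    # (prod_new - 1) - (prod_old - 1) simplifies to prod_new - prod_old.
    costs = [np.prod(1 + r_new[d:]) - np.prod(1 + r_old[d:])
             for d in range(len(r_new))]
    return float(np.mean(costs))
```

If the two portfolios have identical returns the cost is zero; when the new portfolio outperforms after the switch, the cost is positive and grows with how early in the quarter the switch tends to occur.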
The same logic extends to the ~45-day gap between quarter-end and filing date. During this period, the copycat still holds \\(Q\_{i-1}\\), while the fund may continue holding \\(Q\_i\\) or may have already started trading toward \\(Q\_{i+1}\\). We apply the same uniform-switch model over the full span of quarter \\(i+1\\): for each possible switch day during the gap, the fund earns a blend of \\(Q\_i\\) and \\(Q\_{i+1}\\) returns. We average over all scenarios—including those where the switch occurs after the gap—weighted by their probability.[^fn:5]<details><summary>Code</summary><div class="details"><a id="code-snippet--sa-delay"/>
```python
import json
import yfinance as yf
import pandas as pd
from datetime import datetime, timedelta
import numpy as np
import requests
import time
import os
import warnings
warnings.filterwarnings('ignore')
# Parse data from the scraper block
parsed = json.loads(data) if isinstance(data, str) else data
filings = parsed["filings"]
# Build internal structures
filing_dates = {f["quarter"]: f["filing_date"] for f in filings}
quarter_end_dates = {f["quarter"]: f["quarter_end"] for f in filings}
quarters = [f["quarter"] for f in filings]
# Convert holdings list to dict keyed by quarter. Multiple positions in the
# same ticker with different types are aggregated by value per
# (ticker, type) pair.
holdings = {}
for f in filings:
    positions = {}
    for h in f["holdings"]:
        ticker = h["ticker"]
        pos_type = h["type"]
        value = h["value"]
        key = (ticker, pos_type)
        positions[key] = positions.get(key, 0) + value
    holdings[f["quarter"]] = positions

def _extract_close_series(df, ticker):
    """Extract a single close-price series from a yfinance result."""
    if df.empty:
        return pd.Series(dtype=float)
    if isinstance(df.columns, pd.MultiIndex):
        if 'Close' not in df.columns.get_level_values(0):
            return pd.Series(dtype=float)
        close = df['Close']
        if isinstance(close, pd.DataFrame):
            if ticker in close.columns:
                series = close[ticker]
            elif len(close.columns) == 1:
                series = close.iloc[:, 0]
            else:
                return pd.Series(dtype=float)
        else:
            series = close
    elif 'Close' in df.columns:
        series = df['Close']
        if isinstance(series, pd.DataFrame):
            series = series.iloc[:, 0]
    else:
        return pd.Series(dtype=float)
    return pd.to_numeric(series, errors='coerce').dropna()

def _download_close_series(ticker, start, end):
    """Download one ticker's close series; used to repair flaky batch misses."""
    df = yf.download(ticker, start=start, end=end, progress=False,
                     auto_adjust=True)
    return _extract_close_series(df, ticker)

def get_prices(tickers, dates):
    """Fetch close prices for tickers on specific dates."""
    unique_tickers = sorted(set(tickers))
    all_dates = [datetime.strptime(d, '%Y-%m-%d') for d in dates]
    start = min(all_dates) - timedelta(days=5)
    end = max(all_dates) + timedelta(days=5)
    df = yf.download(unique_tickers, start=start, end=end, progress=False,
                     auto_adjust=True)
    # yf.download returns MultiIndex columns (metric, ticker) for multiple tickers
    if df.empty:
        close = pd.DataFrame()
    elif (isinstance(df.columns, pd.MultiIndex)
          and 'Close' in df.columns.get_level_values(0)):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = unique_tickers
    else:
        close = pd.DataFrame()
    prices = {}
    for ticker in unique_tickers:
        if ticker in close.columns:
            series = pd.to_numeric(close[ticker], errors='coerce').dropna()
        else:
            series = pd.Series(dtype=float)
        if series.empty:
            series = _download_close_series(ticker, start, end)
        if series.empty:
            continue
        prices[ticker] = {}
        for date_str in dates:
            target = pd.Timestamp(datetime.strptime(date_str, '%Y-%m-%d'))
            after = series[series.index >= target]
            if not after.empty:
                prices[ticker][date_str] = float(after.iloc[0])
            else:
                before = series[series.index <= target]
                if not before.empty:
                    prices[ticker][date_str] = float(before.iloc[-1])
    return prices

def _price_on_or_after(px_by_date, target_date):
    """Return (date, price) for the first available price on/after target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d >= target_date)
    if not dates:
        return None
    d = dates[0]
    return d, px_by_date[d]

def _price_on_or_before(px_by_date, target_date):
    """Return (date, price) for the last available price on/before target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d <= target_date)
    if not dates:
        return None
    d = dates[-1]
    return d, px_by_date[d]

def _period_price_pair(px_by_date, start_date, end_date):
    """Return start/end prices for a period using sensible boundary alignment."""
    start = _price_on_or_after(px_by_date, start_date)
    end = _price_on_or_before(px_by_date, end_date)
    if start is None or end is None:
        return None
    start_actual, p0 = start
    end_actual, p1 = end
    if end_actual < start_actual:
        return None
    return start_actual, end_actual, p0, p1

def _option_position_key(ticker, pos_type):
    return (ticker, pos_type)

def _linear_underlying_sign(pos_type):
    """Direction when option rows are converted to underlying equity exposure."""
    return -1 if pos_type == 'put' else 1

def compute_return(positions, prices, start_date, end_date,
                   mode='equity_only', option_prices=None):
    """Compute portfolio return between two dates.

    The 13F value for an option row is treated as underlying notional, not
    option premium. Option contracts are sized from that notional, but the
    portfolio denominator is estimated deployed capital: stock value plus
    option premium cost. This avoids treating the gap between option
    notional and option premium as cash. In 'full' mode, every option row
    requires a MarketData price series; missing data raises rather than
    falling back.
    """
    total_cost = 0
    portfolio_pnl = 0
    for (ticker, pos_type), value in positions.items():
        is_option = pos_type in ('call', 'put')
        stock_px = prices.get(ticker)
        if mode == 'equity_only':
            if pos_type not in ('long', 'call', 'put'):
                continue
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            total_cost += value
            portfolio_pnl += value * _linear_underlying_sign(pos_type) * stock_ret
            continue
        if is_option:
            opt_key = _option_position_key(ticker, pos_type)
            opt_px = option_prices.get(opt_key) if option_prices else None
            if not opt_px:
                raise RuntimeError(
                    f"No MarketData option prices for {opt_key} in period "
                    f"{start_date}..{end_date}")
            pair = _period_price_pair(opt_px, start_date, end_date)
            if pair is None:
                raise RuntimeError(
                    f"MarketData option price series for {opt_key} does not "
                    f"cover {start_date}..{end_date}")
            start_actual, end_actual, opt_p0, opt_p1 = pair
            stock_start = _price_on_or_after(stock_px, start_actual)
            if stock_start is None or stock_start[1] <= 0:
                stock_start = _price_on_or_after(stock_px, start_date)
            if stock_start is None or stock_start[1] <= 0:
                raise RuntimeError(
                    f"No underlying price for {ticker} at {start_date}")
            p0, p1 = opt_p0, opt_p1
            underlying_p0 = stock_start[1]
            if p0 <= 0 or underlying_p0 <= 0:
                continue
            position_cost = value * (p0 / underlying_p0)
            position_pnl = value * ((p1 - p0) / underlying_p0)
        else:
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            position_cost = value
            position_pnl = value * stock_ret
        if position_cost <= 0:
            continue
        total_cost += position_cost
        portfolio_pnl += position_pnl
    return portfolio_pnl / total_cost if total_cost else None

def annualize(ret, days):
    """Annualize a return over a given number of calendar days."""
    if ret is None or days <= 0:
        return None
    return (1 + ret) ** (365.25 / days) - 1

def fmt(ret):
    return f"{ret * 100:+.2f}%" if ret is not None else "N/A"

# Collect all tickers and dates
all_tickers = set()
for positions in holdings.values():
    for (ticker, _) in positions:
        all_tickers.add(ticker)
all_tickers.add('SPY')
today = datetime.now().strftime('%Y-%m-%d')
first_date = filing_dates[quarters[0]]
all_dates = set(filing_dates.values()) | set(quarter_end_dates.values()) | {today}
prices = get_prices(sorted(all_tickers), sorted(all_dates))

# Resolve `today` to the actual last available closing date. yfinance may
# not have data for today (market still open or holiday), so we look up
# what date SPY's price actually corresponds to.
def _resolve_price_date(prices, requested_date):
    """Return the actual trading date of the price stored under requested_date."""
    ref = 'SPY' if 'SPY' in prices else next(iter(prices), None)
    if not ref or requested_date not in prices[ref]:
        return requested_date
    target_price = prices[ref][requested_date]
    # Re-download a small window to find the real date of this price
    start = datetime.strptime(requested_date, '%Y-%m-%d') - timedelta(days=10)
    end = datetime.strptime(requested_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(ref, start=start, end=end, progress=False,
                     auto_adjust=True)
    if df.empty:
        return requested_date
    if isinstance(df.columns, pd.MultiIndex):
        close = df['Close'][ref].dropna()
    elif 'Close' in df.columns:
        close = df['Close'].dropna()
    else:
        close = df.iloc[:, 0].dropna()
    for dt, px in close.items():
        val = float(px.iloc[0]) if isinstance(px, pd.Series) else float(px)
        if abs(val - target_price) < 0.01:
            ts = dt[0] if isinstance(dt, tuple) else dt
            return pd.Timestamp(ts).strftime('%Y-%m-%d')
    return requested_date

today_resolved = _resolve_price_date(prices, today)
if today_resolved != today:
    for ticker in prices:
        if today in prices[ticker]:
            prices[ticker][today_resolved] = prices[ticker].pop(today)
    today = today_resolved

def download_daily(tickers, start_date, end_date):
    """Download daily close prices from yfinance, handling MultiIndex.
    Dates are 'YYYY-MM-DD' strings. Adds a small buffer for trading-day
    alignment."""
    tickers_sorted = sorted(tickers)
    start = datetime.strptime(start_date, '%Y-%m-%d') - timedelta(days=5)
    end = datetime.strptime(end_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(tickers_sorted, start=start, end=end, progress=False,
                     auto_adjust=True)
    if df.empty:
        close = pd.DataFrame()
    elif (isinstance(df.columns, pd.MultiIndex)
          and 'Close' in df.columns.get_level_values(0)):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = tickers_sorted
    else:
        close = pd.DataFrame()
    for ticker in tickers_sorted:
        if ticker in close.columns and not close[ticker].dropna().empty:
            continue
        series = _download_close_series(ticker, start, end)
        if not series.empty:
            close[ticker] = series
    return close.sort_index()

# -- Historical option prices via MarketData --------------------------------
OPTION_CACHE_DIR = os.path.expanduser('~/My Drive/notes/.sa-lp-option-cache')
_MD_BASE = 'https://api.marketdata.app/v1'
_MD_RATE_DELAY = 0.15
OPTION_CACHE_COLUMNS = [
    'date', 'selected_on', 'option_type', 'symbol', 'strike', 'expiry',
    'delta', 'price']

# Default contract selection parameters. The option proxy picks a contract
# matching option type, with expiry between min_days and max_days of the
# period start, and |delta| closest to delta_target. When the chain is
# sparse, the achieved |delta| may be far from the target; the sensitivity
# block reports achieved |delta| so this is visible rather than silent.
OPTION_DELTA = 0.15
EXPIRY_MIN_DAYS = 270  # ~9 months
EXPIRY_MAX_DAYS = 456  # ~15 months

def _normalize_option_type(option_type):
    option_type = str(option_type).lower()
    if option_type not in ('call', 'put'):
        raise ValueError(f"Unsupported option type: {option_type}")
    return option_type

def _empty_option_cache():
    return pd.DataFrame(columns=OPTION_CACHE_COLUMNS)

def _option_cache_path(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Return the cache CSV path for (ticker, type, delta_target, window).

    When the parameter triple equals the baseline (0.15, 270-456 days),
    the historical filename ``TICKER-TYPE.csv`` is used so the
    main-backtest cache is reused automatically. Any non-baseline combo
    lives in a separate ``TICKER-TYPE-d<delta>-e<min>-<max>.csv`` file so
    a sensitivity sweep never pollutes the baseline cache (which the
    portfolio calculator reads to pick the representative contract for
    the current filing).
    """
    option_type = _normalize_option_type(option_type)
    is_baseline = (
        abs(delta_target - OPTION_DELTA) < 1e-9
        and min_days == EXPIRY_MIN_DAYS
        and max_days == EXPIRY_MAX_DAYS)
    if is_baseline:
        return os.path.join(OPTION_CACHE_DIR, f'{ticker}-{option_type}.csv')
    return os.path.join(
        OPTION_CACHE_DIR,
        f'{ticker}-{option_type}-d{delta_target:g}-e{min_days}-{max_days}.csv')

def _load_option_cache(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Load cached MarketData rows for a ticker/type/target/window.
    Returns DataFrame or empty."""
    option_type = _normalize_option_type(option_type)
    path = _option_cache_path(ticker, option_type, delta_target,
                              min_days, max_days)
    if not os.path.exists(path):
        return _empty_option_cache()
    df = pd.read_csv(path)
    if df.empty:
        return _empty_option_cache()
    for col in OPTION_CACHE_COLUMNS:
        if col not in df.columns:
            df[col] = np.nan
    for col in ('date', 'selected_on'):
        df[col] = pd.to_datetime(
            df[col], errors='coerce').dt.strftime('%Y-%m-%d')
    df['option_type'] = df['option_type'].fillna(option_type).str.lower()
    cache = df[OPTION_CACHE_COLUMNS].copy()
    cache = cache[cache['option_type'] == option_type].copy()
    cache.dropna(subset=['date'], inplace=True)
    for col in ('strike', 'delta', 'price'):
        cache[col] = pd.to_numeric(cache[col], errors='coerce')
    cache.drop_duplicates(
        subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
        keep='last', inplace=True)
    cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
    return cache[OPTION_CACHE_COLUMNS]

def _save_option_cache(ticker, option_type, df, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Persist typed option cache to CSV."""
    option_type = _normalize_option_type(option_type)
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    path = _option_cache_path(ticker, option_type, delta_target,
                              min_days, max_days)
    if df.empty:
        df = _empty_option_cache()
    else:
        df = df.copy()
        df['option_type'] = option_type
        for col in OPTION_CACHE_COLUMNS:
            if col not in df.columns:
                df[col] = np.nan
        df.drop_duplicates(
            subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
            keep='last', inplace=True)
        df.sort_values(['date', 'expiry', 'strike'], inplace=True)
    df.to_csv(path, index=False)

def _contract_window(ref_date_str, min_days=EXPIRY_MIN_DAYS,
                     max_days=EXPIRY_MAX_DAYS):
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    return ref + timedelta(days=min_days), ref + timedelta(days=max_days)

def _contract_from_cache_row(row, ref_date_str, option_type,
                             min_days=EXPIRY_MIN_DAYS,
                             max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    if str(row.get('option_type', option_type)).lower() != option_type:
        return None
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    try:
        exp = datetime.strptime(str(row['expiry']), '%Y-%m-%d')
    except (KeyError, TypeError, ValueError):
        return None
    if not (lo <= exp <= hi):
        return None
    strike = _safe_float(row.get('strike'))
    delta = _safe_float(row.get('delta'))
    price = _safe_float(row.get('price'))
    if strike is None or delta is None or price is None or price <= 0:
        return None
    return {
        'selected_on': row.get('selected_on'),
        'option_type': option_type,
        'symbol': row.get('symbol'),
        'strike': strike,
        'expiry': str(row['expiry']),
        'delta': delta,
        'price': price,
    }

def _select_cached_contract(cache, option_type, ref_date_str,
                            delta_target=OPTION_DELTA,
                            min_days=EXPIRY_MIN_DAYS,
                            max_days=EXPIRY_MAX_DAYS,
                            require_selected=False):
    rows = cache[(cache['date'] == ref_date_str) &
                 (cache['option_type'] == option_type)]
    selected_rows = rows[rows['selected_on'] == ref_date_str]
    if not selected_rows.empty:
        rows = selected_rows
    elif require_selected:
        rows = selected_rows
    candidates = []
    for _, row in rows.iterrows():
        contract = _contract_from_cache_row(row, ref_date_str, option_type,
                                            min_days, max_days)
        if contract:
            candidates.append(contract)
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def _parse_option_price(contract):
    """Extract a mark price from an option contract record."""
    mid = _safe_float(contract.get('mid'))
    if mid and mid > 0:
        return mid
    bid = _safe_float(contract.get('bid'))
    ask = _safe_float(contract.get('ask'))
    last = _safe_float(contract.get('last'))
    if bid and ask and bid > 0 and ask > 0:
        return (bid + ask) / 2
    if last and last > 0:
        return last
    return None

def _safe_float(val):
try:
out = float(val)
if np.isnan(out):
return None
return out
except (TypeError, ValueError):
return None
def _marketdata_key():
"""Return the MarketData API key, or None if unavailable.
Resolution order:
1. ``MARKETDATA_KEY`` / ``MARKETDATA_API_KEY`` environment variables.
2. ``pass env/marketdata-token`` (local ``pass`` store).
The result is memoised on the function object so repeated lookups
during a sweep do not reshell. Fetch helpers raise themselves when
called without a key, so a fully cached run still succeeds without
requiring either source.
"""
if hasattr(_marketdata_key, '_cached'):
return _marketdata_key._cached
key = (os.environ.get('MARKETDATA_KEY', '')
or os.environ.get('MARKETDATA_API_KEY', ''))
if not key:
try:
import subprocess
out = subprocess.run(
['pass', 'show', 'env/marketdata-token'],
capture_output=True, text=True, timeout=5, check=False)
if out.returncode == 0:
key = out.stdout.strip().splitlines()[0] if out.stdout else ''
except (FileNotFoundError, subprocess.TimeoutExpired):
key = ''
_marketdata_key._cached = key or None
return _marketdata_key._cached
def _marketdata_get(path, params, api_key):
"""Fetch a MarketData endpoint, returning normalized row dictionaries.
Raises on HTTP errors or a non-'ok' status. 'no_data' is returned as
an empty list so that callers can distinguish 'nothing available' from
'request failed'.
"""
headers = {'Accept': 'application/json', 'Authorization': f'Bearer {api_key}'}
resp = requests.get(_MD_BASE + path, params=params, headers=headers,
timeout=30)
resp.raise_for_status()
body = resp.json()
status = body.get('s')
if status == 'no_data':
return []
if status != 'ok':
raise RuntimeError(
f"MarketData {path} returned status={status!r}: "
f"{body.get('errmsg') or body}")
lengths = [len(v) for v in body.values() if isinstance(v, list)]
n = max(lengths) if lengths else 0
rows = []
for i in range(n):
row = {}
for key, val in body.items():
if isinstance(val, list):
                row[key] = val[i] if i < len(val) else None
            else:
                row[key] = val
        rows.append(row)
    return rows

def _marketdata_date(timestamp):
    try:
        return datetime.utcfromtimestamp(int(timestamp)).strftime('%Y-%m-%d')
    except (TypeError, ValueError, OSError):
        return None

def _occ_symbol(ticker, option_type, strike, expiry):
    """Build a standard OCC option symbol from contract fields."""
    cp = 'C' if _normalize_option_type(option_type) == 'call' else 'P'
    exp = datetime.strptime(str(expiry), '%Y-%m-%d').strftime('%y%m%d')
    strike_int = int(round(float(strike) * 1000))
    root = ticker.upper().replace('.', '')
    return f'{root}{exp}{cp}{strike_int:08d}'

# Chains are always fetched with a broad expiry window so they can be cached
# and reused for in-memory selection across any (delta_target, expiry window)
# combination in the sensitivity sweep.
CHAIN_FETCH_MIN_DAYS = 30
CHAIN_FETCH_MAX_DAYS = 760

def _fetch_marketdata_chain(ticker, date_str, option_type, api_key,
                            min_days=CHAIN_FETCH_MIN_DAYS,
                            max_days=CHAIN_FETCH_MAX_DAYS):
    lo, hi = _contract_window(date_str, min_days, max_days)
    params = {
        'date': date_str,
        'from': lo.strftime('%Y-%m-%d'),
        'to': hi.strftime('%Y-%m-%d'),
        'side': _normalize_option_type(option_type),
        'expiration': 'all',
    }
    return _marketdata_get(f'/options/chain/{ticker}/', params, api_key)

# Chain cache: one CSV per (ticker, type, date) storing the broad-window
# chain. Lets the sensitivity sweep re-select contracts for different
# delta targets and expiry windows without refetching.
CHAIN_CACHE_DIR = os.path.join(OPTION_CACHE_DIR, 'chains')

def _chain_cache_path(ticker, option_type, date_str):
    option_type = _normalize_option_type(option_type)
    return os.path.join(CHAIN_CACHE_DIR,
                        f'{ticker}-{option_type}-{date_str}.csv')

def _load_chain_cache(ticker, option_type, date_str):
    path = _chain_cache_path(ticker, option_type, date_str)
    if not os.path.exists(path):
        return None
    df = pd.read_csv(path)
    if df.empty:
        return []
    return df.to_dict('records')

def _save_chain_cache(ticker, option_type, date_str, chain):
    if not chain:
        return
    os.makedirs(CHAIN_CACHE_DIR, exist_ok=True)
    path = _chain_cache_path(ticker, option_type, date_str)
    pd.DataFrame(chain).to_csv(path, index=False)

def _get_or_fetch_chain(ticker, date_str, option_type, api_key,
                        fetched_counter=None):
    """Return the cached broad chain for (ticker, type, date), fetching if absent.

    Requires ``api_key`` only when a fetch is actually needed.
    """
    chain = _load_chain_cache(ticker, option_type, date_str)
    if chain is not None:
        return chain
    if not api_key:
        raise RuntimeError(
            "MARKETDATA_KEY is not set but a chain fetch is required for "
            f"{ticker} {option_type} on {date_str}.")
    time.sleep(_MD_RATE_DELAY)
    chain = _fetch_marketdata_chain(ticker, date_str, option_type, api_key)
    if fetched_counter is not None:
        fetched_counter['marketdata_chains'] += 1
    _save_chain_cache(ticker, option_type, date_str, chain)
    return chain

def _fetch_marketdata_quotes(symbol, start_date, end_date, api_key):
    to_date = (datetime.strptime(end_date, '%Y-%m-%d')
               + timedelta(days=1)).strftime('%Y-%m-%d')
    rows = _marketdata_get(f'/options/quotes/{symbol}/',
                           {'from': start_date, 'to': to_date}, api_key)
    prices = {}
    for row in rows:
        date_str = _marketdata_date(row.get('updated'))
        if not date_str:
            continue
        price = _parse_option_price(row)
        if price is not None and price > 0:
prices[date_str] = price
return prices
def _implied_vol_from_price(S, K, T, option_price, option_type):
"""Infer Black-Scholes volatility from an observed option mid price."""
if any(x is None for x in (S, K, T, option_price)):
return None
    if S <= 0 or K <= 0 or T <= 0 or option_price <= 0:
        return None
    intrinsic = max(S - K, 0) if option_type == 'call' else max(K - S, 0)
    upper = S if option_type == 'call' else K
    if option_price < intrinsic - 1e-6 or option_price > upper * 1.5:
return None
lo, hi = 1e-4, 5.0
try:
        if (option_price < bs_price(S, K, T, lo, option_type) - 1e-4
                or option_price > bs_price(S, K, T, hi, option_type) + 1e-4):
return None
for _ in range(80):
mid = (lo + hi) / 2
            if bs_price(S, K, T, mid, option_type) < option_price:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    except (FloatingPointError, ValueError, ZeroDivisionError):
        return None

def _marketdata_delta(row, ref_date_str, expiry, option_type, price):
    """Use vendor delta when present; otherwise infer it from the quote."""
    native = _safe_float(row.get('delta'))
    if native is not None and native != 0:
        return native
    S = _safe_float(row.get('underlyingPrice'))
    K = _safe_float(row.get('strike'))
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    exp = datetime.strptime(expiry, '%Y-%m-%d')
    T = max((exp - ref).days / 365.25, 1e-6)
    sigma = _safe_float(row.get('iv'))
    if sigma is None or sigma <= 0:
        sigma = _implied_vol_from_price(S, K, T, price, option_type)
    if S is None or K is None or sigma is None or sigma <= 0:
        return None
    return bs_delta(S, K, T, sigma, option_type)

def _select_marketdata_contract(chain, ref_date_str, option_type,
                                delta_target=OPTION_DELTA,
                                min_days=EXPIRY_MIN_DAYS,
                                max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    candidates = []
    for c in chain:
        if str(c.get('side', '')).lower() != option_type:
            continue
        expiry = _marketdata_date(c.get('expiration'))
        if not expiry:
            continue
        exp = datetime.strptime(expiry, '%Y-%m-%d')
        if not (lo <= exp <= hi):
            continue
        price = _parse_option_price(c)
        if price is None or price <= 0:
            continue
        delta = _marketdata_delta(c, ref_date_str, expiry, option_type, price)
        if delta is None or delta == 0:
            continue
        strike = _safe_float(c.get('strike'))
        symbol = c.get('optionSymbol')
        if strike is None or not symbol:
            continue
        candidates.append({
            'option_type': option_type,
            'symbol': symbol,
            'strike': strike,
            'expiry': expiry,
            'delta': delta,
            'price': price,
        })
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def download_option_prices(option_positions, quarters, holdings, filing_dates,
                           today, delta_target=OPTION_DELTA,
                           min_days=EXPIRY_MIN_DAYS,
                           max_days=EXPIRY_MAX_DAYS):
    """Download historical representative option prices from MarketData.

    MarketData is the sole supported provider. MARKETDATA_KEY must be set.

    For each (ticker, option_type) and each filing period in which that
    position is held:

    1. On the first trading day, select a contract matching type, with
       expiry between ``min_days`` and ``max_days`` of the period start,
       and |delta| closest to ``delta_target``. MarketData's Starter plan
       often returns null Greeks, so delta is inferred from the observed
       mid price via Black-Scholes when the vendor delta is missing.
    2. Lock in that contract for the period.
    3. Track its historical mid price through the period.

    The broad option chain for each (ticker, type, first_day) is cached
    to disk so that sensitivity sweeps over (delta_target, expiry window)
    reuse a single fetch.

    Raises ``RuntimeError`` if no suitable contract can be selected for
    any required (ticker, type, period), or if MarketData returns no
    price series for the selected contract.

    Parameters
    ----------
    delta_target : float
        Target |delta| for contract selection (default ``OPTION_DELTA``).
    min_days, max_days : int
        Contract expiry window in days from period start (default 270-456,
        i.e. 9-15 months).

    Returns
    -------
    per_period : dict
        {quarter_str: {(ticker, type): {date_str: float}}}
        Option prices keyed by filing period then option position. Each
        period has its own contract's prices.
    """
    option_positions = sorted({
        (ticker, _normalize_option_type(pos_type))
        for ticker, pos_type in option_positions})
    md_key = _marketdata_key()
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    per_period = {}  # {q: {(ticker, type): {date_str: price}}}
    fetched = {'marketdata_chains': 0, 'marketdata_quotes': 0}
    for ticker, option_type in option_positions:
        opt_key = _option_position_key(ticker, option_type)
        cache = _load_option_cache(ticker, option_type, delta_target,
                                   min_days, max_days)
        new_rows = []
        for i, q in enumerate(quarters):
            # Skip quarters where this exact option position is absent.
            if opt_key not in holdings[q]:
                continue
            period_start = filing_dates[q]
            period_end = (filing_dates[quarters[i + 1]]
                          if i < len(quarters) - 1 else today)
            trading_days = pd.bdate_range(period_start, period_end)
            if len(trading_days) == 0:
                continue
            first_day = trading_days[0].strftime('%Y-%m-%d')
            # -- Select contract on first trading day --
            contract = _select_cached_contract(
                cache, option_type, first_day,
                delta_target=delta_target,
                min_days=min_days, max_days=max_days,
                require_selected=True)
            if contract is None:
                chain = _get_or_fetch_chain(
                    ticker, first_day, option_type, md_key, fetched)
                contract = _select_marketdata_contract(
                    chain, first_day, option_type,
                    delta_target=delta_target,
                    min_days=min_days, max_days=max_days)
                if contract is None:
                    raise RuntimeError(
                        f"MarketData returned no usable {option_type} contract "
                        f"for {ticker} on {first_day} (period {q}) at "
                        f"delta={delta_target}, "
                        f"expiry {min_days}-{max_days}d")
                new_rows.append({
                    'date': first_day,
                    'selected_on': first_day,
                    'option_type': option_type,
                    'symbol': contract.get('symbol'),
                    'strike': contract['strike'],
                    'expiry': contract['expiry'],
                    'delta': contract['delta'],
                    'price': contract['price'],
                })
            strike = contract['strike']
            expiry = contract['expiry']
            symbol = contract.get('symbol') or _occ_symbol(
                ticker, option_type, strike, expiry)
            # -- Collect prices for this period (fresh dict per period) --
            period_prices = {}
            # Fast path: read matching prices from cache.
            rows = cache[
                (cache['date'] >= period_start)
                & (cache['date'] <= period_end)
                & (cache['option_type'] == option_type)
                & (abs(cache['strike'] - strike) < 0.01)
                & (cache['expiry'].astype(str) == str(expiry))
                & pd.notna(cache['price'])]
            selected_rows = rows[rows['selected_on'] == first_day]
            if not selected_rows.empty:
                rows = selected_rows
            for _, row in rows.iterrows():
                period_prices[row['date']] = float(row['price'])
            # Decide whether to refresh quotes. With a key, refresh whenever
            # the cached series does not reach period_end. Without a key,
            # only fail if the cached series is empty; a slightly stale tail
            # is acceptable for cache-only runs (e.g. sensitivity sweeps
            # replaying the baseline contract).
            has_partial = bool(period_prices)
            reaches_end = has_partial and max(period_prices) >= period_end
if md_key and not reaches_end:
time.sleep(_MD_RATE_DELAY)
quote_prices = _fetch_marketdata_quotes(
symbol, period_start, period_end, md_key)
fetched['marketdata_quotes'] += 1
for day_str, price in quote_prices.items():
                    if period_start <= day_str <= period_end:
                        period_prices[day_str] = price
                        new_rows.append({
                            'date': day_str,
                            'selected_on': first_day,
                            'option_type': option_type,
                            'symbol': symbol,
                            'strike': strike,
                            'expiry': expiry,
                            'delta': contract['delta'],
                            'price': price,
                        })
                if contract.get('price') and first_day not in period_prices:
                    period_prices[first_day] = contract['price']
            elif not md_key and not has_partial:
                raise RuntimeError(
                    "MARKETDATA_KEY is not set and no cached quotes exist "
                    f"for {symbol} in {period_start}..{period_end}.")
            if not period_prices:
                raise RuntimeError(
                    f"MarketData returned no quotes for {symbol} "
                    f"({opt_key}) in {period_start}..{period_end}")
            per_period.setdefault(q, {})[opt_key] = period_prices
        # Persist new data to cache
        if new_rows:
            new_df = pd.DataFrame(new_rows)
            cache = pd.concat([cache, new_df], ignore_index=True)
            cache.drop_duplicates(
                subset=['date', 'selected_on', 'option_type', 'strike',
                        'expiry'],
                keep='last', inplace=True)
            cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
            _save_option_cache(ticker, option_type, cache, delta_target,
                               min_days, max_days)
    if any(fetched.values()):
        import sys
        parts = []
        if fetched['marketdata_chains']:
            parts.append(f"{fetched['marketdata_chains']} MarketData chains")
        if fetched['marketdata_quotes']:
            parts.append(
                f"{fetched['marketdata_quotes']} MarketData quote series")
        print(f"[options] Fetched {', '.join(parts)}", file=sys.stderr)
    return per_period

# -- Black-Scholes helpers (used only to infer delta when MarketData's
#    Starter-plan historical Greeks are null; never to reprice returns) -----
from scipy.stats import norm as _norm

def bs_price(S, K, T, sigma, option_type='call'):
    """Black-Scholes option price (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return max(S - K, 0)
        return max(K - S, 0)
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if option_type == 'call':
        return S * _norm.cdf(d1) - K * _norm.cdf(d2)
    return K * _norm.cdf(-d2) - S * _norm.cdf(-d1)

def bs_delta(S, K, T, sigma, option_type='call'):
    """Black-Scholes delta (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return 1.0 if S > K else 0.0
        return -1.0 if S < K else 0.0
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    if option_type == 'call':
        return _norm.cdf(d1)
    return _norm.cdf(d1) - 1

def daily_cumulative(holdings, quarters, filing_dates, close, today, mode,
                     per_period_opt=None):
    """Build a daily series of cumulative growth factors for a given mode.

    For each filing period, stock shares and option contracts are fixed.
    In equity-proxy mode, option rows are converted to linear underlying
    exposure: calls are long underlying and puts are short underlying.
    In option-proxy mode, option rows are sized by 13F underlying notional
    and returns come from MarketData quotes; returns are divided by
    deployed capital (stock value plus option premium cost). Option-proxy
    mode raises if MarketData prices are missing for any required position.
    """
    cum_growth = 1.0
    dates_out = []
    values_out = []
    for i, q in enumerate(quarters):
        period_start = filing_dates[q]
        period_end = (filing_dates[quarters[i + 1]]
                      if i < len(quarters) - 1 else today)
        ps = pd.Timestamp(period_start)
        pe = pd.Timestamp(period_end)
        # Trading days in this period
        mask = (close.index >= ps) & (close.index <= pe)
        period_close = close[mask]
        if period_close.empty:
            continue
        # Option prices for this period (keyed by (ticker, type) → prices)
        quarter_opt = per_period_opt.get(q, {}) if per_period_opt else {}
        # Determine starting prices, fixed exposure, and deployed capital.
        positions = holdings[q]
        exposure = {}
        costs = {}
        start_prices = {}
        start_underlying = {}
        use_opt_px = {}  # track which positions use option prices
        total_cost = 0
        for (ticker, pos_type), value in positions.items():
            is_option = pos_type in ('call', 'put')
            opt_key = _option_position_key(ticker, pos_type)
            if mode == 'equity_only':
                if pos_type not in ('long', 'call', 'put'):
                    continue
                if ticker not in close.columns:
                    continue
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
if avail.empty:
continue
stock_start = float(avail.iloc[0])
                if stock_start <= 0:
                    continue
                start_prices[(ticker, pos_type)] = stock_start
                start_underlying[(ticker, pos_type)] = stock_start
                costs[(ticker, pos_type)] = value
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = False
                total_cost += value
                continue
            # mode == 'full' (option proxy)
            if is_option:
                if opt_key not in quarter_opt:
                    raise RuntimeError(
                        f"No MarketData option prices for {opt_key} in "
                        f"period {q}")
                ticker_opt = quarter_opt[opt_key]
                opt_dates = sorted(d for d in ticker_opt if d >= period_start)
if not opt_dates:
raise RuntimeError(
f"MarketData option prices for {opt_key} in period "
f"{q} contain no dates at or after {period_start}")
if ticker not in close.columns:
raise RuntimeError(
f"No underlying close series for {ticker}")
src = close[ticker].dropna()
avail = src[src.index >= ps]
if avail.empty:
raise RuntimeError(
f"No underlying price for {ticker} at {period_start}")
opt_start = ticker_opt[opt_dates[0]]
underlying_start = float(avail.iloc[0])
                if opt_start <= 0 or underlying_start <= 0:
                    raise RuntimeError(
                        f"Non-positive starting price for {opt_key} in "
                        f"period {q}")
                start_prices[(ticker, pos_type)] = opt_start
                start_underlying[(ticker, pos_type)] = underlying_start
                costs[(ticker, pos_type)] = value * opt_start / underlying_start
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = True
                total_cost += costs[(ticker, pos_type)]
                continue
            # Plain stock in full mode
            if ticker not in close.columns:
                continue
            src = close[ticker].dropna()
            avail = src[src.index >= ps]
if avail.empty:
continue
stock_start = float(avail.iloc[0])
            if stock_start <= 0:
                continue
            start_prices[(ticker, pos_type)] = stock_start
            start_underlying[(ticker, pos_type)] = stock_start
            costs[(ticker, pos_type)] = value
            exposure[(ticker, pos_type)] = value
            use_opt_px[(ticker, pos_type)] = False
            total_cost += value
        if total_cost == 0:
            continue
        # Daily P&L relative to period start. Skip first day of subsequent
        # periods (already recorded as last day of the prior period) to
        # avoid duplicate boundary dates.
        start_idx = 1 if i > 0 else 0
        # Forward-fill: track last known option price so that gaps in
        # option data don't cause positions to vanish mid-period.
last_opt = {k: v for k, v in start_prices.items()
if use_opt_px.get(k)}
for day_idx in range(start_idx, len(period_close)):
day = period_close.index[day_idx]
day_str = day.strftime('%Y-%m-%d')
period_pnl = 0
for (ticker, pos_type), value in exposure.items():
p0 = start_prices[(ticker, pos_type)]
if p0 == 0:
continue
if use_opt_px[(ticker, pos_type)]:
opt_key = _option_position_key(ticker, pos_type)
p1_val = quarter_opt.get(opt_key, {}).get(day_str)
if p1_val is not None:
last_opt[(ticker, pos_type)] = p1_val
else:
p1_val = last_opt.get((ticker, pos_type))
if p1_val is None:
continue
underlying_p0 = start_underlying.get((ticker, pos_type))
                    if not underlying_p0 or underlying_p0 <= 0:
                        continue
                    position_pnl = (value * (float(p1_val) - p0)
                                    / underlying_p0)
                else:
                    if ticker not in period_close.columns:
                        continue
                    p1_val = period_close[ticker].iloc[day_idx]
                    if pd.isna(p1_val):
                        continue
                    stock_ret = (float(p1_val) - p0) / p0
                    if mode == 'equity_only':
                        position_pnl = (
                            value * _linear_underlying_sign(pos_type)
                            * stock_ret)
                    else:
                        position_pnl = value * stock_ret
                period_pnl += position_pnl
            dates_out.append(day)
            values_out.append(cum_growth * (1 + period_pnl / total_cost))
        # Chain: next period starts from the last day's growth factor
        if values_out:
            cum_growth = values_out[-1]
    return dates_out, values_out

first_qe = quarter_end_dates[quarters[0]]
last_fd = filing_dates[quarters[-1]]
dc = download_daily(sorted(all_tickers), first_qe, last_fd)

def pf_ret(positions, close_df, i0, i1, mode='equity_only'):
    """Weighted portfolio return between index positions i0 and i1."""
    if i0 >= i1:
return 0.0
total_value = 0
weighted_return = 0
for (ticker, pos_type), value in positions.items():
if mode == 'equity_only' and pos_type not in ('long', 'call', 'put'):
continue
if ticker not in close_df.columns:
continue
p0 = close_df[ticker].iloc[i0]
p1 = close_df[ticker].iloc[i1]
if pd.isna(p0) or pd.isna(p1) or float(p0) == 0:
continue
ret = (float(p1) - float(p0)) / float(p0)
sign = _linear_underlying_sign(pos_type)
total_value += value
weighted_return += value * sign * ret
return weighted_return / total_value if total_value else None
def slice_dc(start_date, end_date):
"""Slice the daily close DataFrame to a date range (inclusive)."""
return dc[(dc.index >= pd.Timestamp(start_date))
              & (dc.index <= pd.Timestamp(end_date))]

# ── Unified delay cost analysis ──────────────────────────────────────
# The copycat holds Q_{i-1} from filing_{i-1} to filing_i. We model the
# fund as switching from Q_{i-1} to Q_i uniformly during quarter i, and
# from Q_i to Q_{i+1} uniformly during quarter i+1.
print("COPYCAT DELAY COST (equity proxy, uniform switch model)")
print("=" * 57)
print(f"{'Transition':<19} {'Intra-Q':>10} {'Gap':>10} {'Total':>10}")
print("-" * 57)
cum_intra = 1.0
cum_gap = 1.0
cum_total = 1.0
for i in range(1, len(quarters)):
prev_q = quarters[i - 1]
curr_q = quarters[i]
    next_q = quarters[i + 1] if i + 1 < len(quarters) else None
    qe_prev = quarter_end_dates[prev_q]
    qe_curr = quarter_end_dates[curr_q]
    fd_curr = filing_dates[curr_q]
    # ── Intra-quarter: qe[i-1] to qe[i] ─────────────────────────
    # For each trading day d, compute R(Q_i, d, T) - R(Q_{i-1}, d, T).
    pc_q = slice_dc(qe_prev, qe_curr)
    N_q = len(pc_q) - 1
    intra_costs = []
    for d in range(N_q):
        r_new = pf_ret(holdings[curr_q], pc_q, d, N_q)
        r_old = pf_ret(holdings[prev_q], pc_q, d, N_q)
        if r_new is not None and r_old is not None:
            intra_costs.append(r_new - r_old)
    avg_intra = sum(intra_costs) / len(intra_costs) if intra_costs else None
    # ── Gap: qe[i] to filing[i] ─────────────────────────────────
    # The fund may hold Q_i or may have already switched to Q_{i+1}.
    pc_g = slice_dc(qe_curr, fd_curr)
    M = len(pc_g) - 1  # trading days in the gap
    if M <= 0:
        avg_gap = 0.0
    elif next_q is not None:
        # Count trading days in quarter i+1 for probabilities
        qe_next = quarter_end_dates[next_q]
        pc_full = slice_dc(qe_curr, qe_next)
        N_full = len(pc_full) - 1 if len(pc_full) > 1 else 63
r_copy = pf_ret(holdings[prev_q], pc_g, 0, M)
if r_copy is None:
avg_gap = None
else:
            # No-switch: fund holds Q_i for entire gap (switch after gap)
r_qi = pf_ret(holdings[curr_q], pc_g, 0, M)
no_switch = (r_qi - r_copy) if r_qi is not None else None
            # Switch on day d: fund holds Q_i for [0, d], Q_{i+1} for [d, M]
switch_costs = []
for d in range(1, M + 1):
r_a = pf_ret(holdings[curr_q], pc_g, 0, d)
r_b = pf_ret(holdings[next_q], pc_g, d, M)
if r_a is not None and r_b is not None:
r_fund = (1 + r_a) * (1 + r_b) - 1
switch_costs.append(r_fund - r_copy)
p_no = (N_full - M) / N_full
p_each = 1 / N_full
if no_switch is not None:
avg_gap = p_no * no_switch + p_each * sum(switch_costs)
elif switch_costs:
avg_gap = p_each * sum(switch_costs)
else:
avg_gap = None
else:
        # Q_{i+1} not available: simple Q_i vs Q_{i-1}
r_qi = pf_ret(holdings[curr_q], pc_g, 0, M)
r_old = pf_ret(holdings[prev_q], pc_g, 0, M)
if r_qi is not None and r_old is not None:
avg_gap = r_qi - r_old
else:
avg_gap = None
    # ── Total ────────────────────────────────────────────────────
if avg_intra is not None and avg_gap is not None:
total = (1 + avg_intra) * (1 + avg_gap) - 1
elif avg_intra is not None:
total = avg_intra
elif avg_gap is not None:
total = avg_gap
else:
total = None
if avg_intra is not None:
cum_intra *= (1 + avg_intra)
if avg_gap is not None:
cum_gap *= (1 + avg_gap)
if total is not None:
cum_total *= (1 + total)
suffix = " \u2020" if next_q is None else ""
label = f"{prev_q} \u2192 {curr_q}"
    print(f"{label + suffix:<19} {fmt(avg_intra):>10} {fmt(avg_gap):>10} {fmt(total):>10}")
print("-" * 57)
print(f"{'Cumulative':<19} {fmt(cum_intra - 1):>10} {fmt(cum_gap - 1):>10} {fmt(cum_total - 1):>10}")
print()
print("Intra-Q = avg cost of fund switching to Q_i during quarter i")
print("Gap = cost during ~45-day gap, with possible Q_i \u2192 Q_{i+1} switch")
print("\u2020 = Q_{i+1} not yet available; gap uses simple Q_i vs Q_{i-1}")
print("Positive = delay hurts the copycat")
```</div></details>
```text
COPYCAT DELAY COST (equity proxy, uniform switch model)
=========================================================
Transition Intra-Q Gap Total
---------------------------------------------------------
Q4_2024 → Q1_2025 +17.74% -16.87% -2.13%
Q1_2025 → Q2_2025 -6.70% -6.35% -12.63%
Q2_2025 → Q3_2025 +12.55% -4.10% +7.93%
Q3_2025 → Q4_2025 † +4.07% +26.07% +31.20%
---------------------------------------------------------
Cumulative +28.66% -5.88% +21.09%
Intra-Q = avg cost of fund switching to Q_i during quarter i
Gap = cost during ~45-day gap, with possible Q_i → Q_{i+1} switch
† = Q_{i+1} not yet available; gap uses simple Q_i vs Q_{i-1}
Positive = delay hurts the copycat
```
For evaluating the copycat strategy in isolation, this analysis adds little: the copycat's historical returns already embed these delay costs. It matters more when comparing the copycat against investing in the fund itself. The delay cost estimates one observable component of the tracking gap; the other limitations discussed [above](/notes/situational-awareness-lp/) remain unobserved and could move the realized gap in either direction.
As noted, the compounded delay cost over four quarterly transitions (approximately one year) is large, and this estimate uses the equity proxy rather than the representative-option model. The true tracking gap between the copycat and the fund could be larger or smaller depending on the fund’s undisclosed option contracts, short positions, foreign-listed securities, non-equity assets, and actual trading path. I would treat the figure as evidence that disclosure lag matters, not as a literal estimate of the fund’s net investor return.
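The cumulative row compounds the per-transition figures multiplicatively rather than summing them. As a sanity check, here is a minimal sketch using the four "Total" figures re-typed from the table above (so subject to its display rounding, which explains the small discrepancy from the printed +21.09%):

```python
# Per-transition delay costs ("Total" column), as decimals.
# These are re-typed from the rounded table output, not recomputed.
transition_totals = [-0.0213, -0.1263, 0.0793, 0.3120]

# Compound the costs: each transition scales the running growth factor.
cum = 1.0
for t in transition_totals:
    cum *= 1 + t
print(f"{cum - 1:+.2%}")  # within rounding of the table's +21.09%
```

Note that a naive sum of the four totals gives +24.37%, noticeably different from the compounded figure when individual legs are this large.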
Portfolio calculator {#portfolio-calculator}
The calculator below converts the most recent 13F filing into a concrete trade list. In equity-proxy mode, stock rows are bought as shares, call rows are bought as underlying shares, and put rows are shorted as underlying shares, all in proportion to reported underlying notional. In option-proxy mode, the bankroll is treated as deployed capital: stock rows consume capital directly, while option rows target the 13F underlying notional and consume the estimated premium for the cached representative contract. An optional cutoff drops positions below a given capital percentage and redistributes their weight among the rest. You can also exclude individual rows, or include rows below the cutoff, by selecting the relevant checkboxes.<details><summary>Code</summary><div class="details"><a id="code-snippet--sa-calc"/>
```python
import json
import yfinance as yf
import pandas as pd
from datetime import datetime, timedelta
import numpy as np
import requests
import time
import os
import warnings
warnings.filterwarnings('ignore')
# Parse data from the scraper block
parsed = json.loads(data) if isinstance(data, str) else data
filings = parsed["filings"]
# Build internal structures
filing_dates = {f["quarter"]: f["filing_date"] for f in filings}
quarter_end_dates = {f["quarter"]: f["quarter_end"] for f in filings}
quarters = [f["quarter"] for f in filings]
# Convert holdings list to dict keyed by quarter.
# Multiple positions in the same ticker with different types are aggregated
# by value per (ticker, type) pair.
holdings = {}
for f in filings:
positions = {}
for h in f["holdings"]:
ticker = h["ticker"]
pos_type = h["type"]
value = h["value"]
key = (ticker, pos_type)
positions[key] = positions.get(key, 0) + value
holdings[f["quarter"]] = positions
def _extract_close_series(df, ticker):
"""Extract a single close-price series from a yfinance result."""
if df.empty:
return pd.Series(dtype=float)
if isinstance(df.columns, pd.MultiIndex):
if 'Close' not in df.columns.get_level_values(0):
return pd.Series(dtype=float)
close = df['Close']
if isinstance(close, pd.DataFrame):
if ticker in close.columns:
series = close[ticker]
elif len(close.columns) == 1:
series = close.iloc[:, 0]
else:
return pd.Series(dtype=float)
else:
series = close
elif 'Close' in df.columns:
series = df['Close']
if isinstance(series, pd.DataFrame):
series = series.iloc[:, 0]
else:
return pd.Series(dtype=float)
return pd.to_numeric(series, errors='coerce').dropna()
def _download_close_series(ticker, start, end):
"""Download one ticker's close series; used to repair flaky batch misses."""
df = yf.download(ticker, start=start, end=end, progress=False,
auto_adjust=True)
return _extract_close_series(df, ticker)
def get_prices(tickers, dates):
"""Fetch close prices for tickers on specific dates."""
unique_tickers = sorted(set(tickers))
all_dates = [datetime.strptime(d, '%Y-%m-%d') for d in dates]
start = min(all_dates) - timedelta(days=5)
end = max(all_dates) + timedelta(days=5)
df = yf.download(unique_tickers, start=start, end=end, progress=False, auto_adjust=True)
    # yf.download returns MultiIndex columns (metric, ticker) for multiple tickers
if df.empty:
close = pd.DataFrame()
elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
close = df['Close'].copy()
elif 'Close' in df.columns:
close = df[['Close']].copy()
close.columns = unique_tickers
else:
close = pd.DataFrame()
prices = {}
for ticker in unique_tickers:
if ticker in close.columns:
series = pd.to_numeric(close[ticker], errors='coerce').dropna()
else:
series = pd.Series(dtype=float)
if series.empty:
series = _download_close_series(ticker, start, end)
if series.empty:
continue
prices[ticker] = {}
for date_str in dates:
target = pd.Timestamp(datetime.strptime(date_str, '%Y-%m-%d'))
after = series[series.index >= target]
if not after.empty:
prices[ticker][date_str] = float(after.iloc[0])
else:
                before = series[series.index <= target]
                if not before.empty:
                    prices[ticker][date_str] = float(before.iloc[-1])
    return prices

def _price_on_or_after(px_by_date, target_date):
    """Return (date, price) for the first available price on/after target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d >= target_date)
if not dates:
return None
d = dates[0]
return d, px_by_date[d]
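As a quick sanity check, the on/after lookup can be exercised on a toy price dict. The helper is re-stated here (an illustrative copy, not the script's own definition) so the snippet stands alone:

```python
# Illustrative copy of the on/after lookup: first available date >= target.
def price_on_or_after(px_by_date, target_date):
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d >= target_date)
    if not dates:
        return None
    return dates[0], px_by_date[dates[0]]

px = {'2026-02-27': 100.0, '2026-03-02': 101.5}
print(price_on_or_after(px, '2026-02-28'))  # ('2026-03-02', 101.5) — rolls forward over the weekend
print(price_on_or_after(px, '2026-12-31'))  # None — nothing on or after the target
```

Because ISO `YYYY-MM-DD` strings sort lexicographically in date order, plain string comparison is enough here; no datetime parsing is needed.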
def _price_on_or_before(px_by_date, target_date):
    """Return (date, price) for the last available price on/before target."""
    if not px_by_date:
        return None
    dates = sorted(d for d in px_by_date if d <= target_date)
    if not dates:
        return None
    d = dates[-1]
    return d, px_by_date[d]

def _period_price_pair(px_by_date, start_date, end_date):
    """Return start/end prices for a period using sensible boundary alignment."""
    start = _price_on_or_after(px_by_date, start_date)
    end = _price_on_or_before(px_by_date, end_date)
    if start is None or end is None:
        return None
    start_actual, p0 = start
    end_actual, p1 = end
    if end_actual < start_actual:
        return None
    return start_actual, end_actual, p0, p1

def _option_position_key(ticker, pos_type):
    return (ticker, pos_type)

def _linear_underlying_sign(pos_type):
    """Direction when option rows are converted to underlying equity exposure."""
    return -1 if pos_type == 'put' else 1

def compute_return(positions, prices, start_date, end_date,
                   mode='equity_only', option_prices=None):
    """Compute portfolio return between two dates.

    The 13F value for an option row is treated as underlying notional, not
    option premium. Option contracts are sized from that notional, but the
    portfolio denominator is estimated deployed capital: stock value plus
    option premium cost. This avoids treating the gap between option
    notional and option premium as cash.

    In 'full' mode, every option row requires a MarketData price series;
    missing data raises rather than falling back.
    """
    total_cost = 0
    portfolio_pnl = 0
    for (ticker, pos_type), value in positions.items():
        is_option = pos_type in ('call', 'put')
        stock_px = prices.get(ticker)
        if mode == 'equity_only':
            if pos_type not in ('long', 'call', 'put'):
                continue
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            total_cost += value
            portfolio_pnl += value * _linear_underlying_sign(pos_type) * stock_ret
            continue
        if is_option:
            opt_key = _option_position_key(ticker, pos_type)
            opt_px = option_prices.get(opt_key) if option_prices else None
            if not opt_px:
                raise RuntimeError(
                    f"No MarketData option prices for {opt_key} in period "
                    f"{start_date}..{end_date}")
            pair = _period_price_pair(opt_px, start_date, end_date)
            if pair is None:
                raise RuntimeError(
                    f"MarketData option price series for {opt_key} does not "
                    f"cover {start_date}..{end_date}")
            start_actual, end_actual, opt_p0, opt_p1 = pair
            stock_start = _price_on_or_after(stock_px, start_actual)
            if stock_start is None or stock_start[1] <= 0:
                stock_start = _price_on_or_after(stock_px, start_date)
            if stock_start is None or stock_start[1] <= 0:
                raise RuntimeError(
                    f"No underlying price for {ticker} at {start_date}")
            p0, p1 = opt_p0, opt_p1
            underlying_p0 = stock_start[1]
            if p0 <= 0 or underlying_p0 <= 0:
                continue
            position_cost = value * (p0 / underlying_p0)
            position_pnl = value * ((p1 - p0) / underlying_p0)
        else:
            pair = _period_price_pair(stock_px, start_date, end_date)
            if pair is None:
                continue
            start_actual, end_actual, p0, p1 = pair
            if p0 == 0:
                continue
            stock_ret = (p1 - p0) / p0
            position_cost = value
            position_pnl = value * stock_ret
        if position_cost <= 0:
            continue
        total_cost += position_cost
        portfolio_pnl += position_pnl
    return portfolio_pnl / total_cost if total_cost else None

def annualize(ret, days):
    """Annualize a return over a given number of calendar days."""
    if ret is None or days <= 0:
        return None
    return (1 + ret) ** (365.25 / days) - 1

def fmt(ret):
    return f"{ret * 100:+.2f}%" if ret is not None else "N/A"

# Collect all tickers and dates
all_tickers = set()
for positions in holdings.values():
    for (ticker, _) in positions:
        all_tickers.add(ticker)
all_tickers.add('SPY')
today = datetime.now().strftime('%Y-%m-%d')
first_date = filing_dates[quarters[0]]
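For intuition on the annualization step: the formula compounds the period return up to a 365.25-day year, so +10% over half a year annualizes to +21% (since 1.1² = 1.21). A standalone illustrative copy of `annualize`:

```python
def annualize_example(ret, days):
    # (1 + ret) ** (365.25 / days) - 1, mirroring annualize() above
    if ret is None or days <= 0:
        return None
    return (1 + ret) ** (365.25 / days) - 1

print(round(annualize_example(0.10, 365.25 / 2), 6))  # 0.21
print(annualize_example(0.05, 0))                     # None — degenerate period
```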
all_dates = set(filing_dates.values()) | set(quarter_end_dates.values()) | {today}
prices = get_prices(sorted(all_tickers), sorted(all_dates))

# Resolve `today` to the actual last available closing date. yfinance may
# not have data for today (market still open or holiday), so we look up what
# date SPY's price actually corresponds to.
def _resolve_price_date(prices, requested_date):
    """Return the actual trading date of the price stored under requested_date."""
    ref = 'SPY' if 'SPY' in prices else next(iter(prices), None)
    if not ref or requested_date not in prices[ref]:
        return requested_date
    target_price = prices[ref][requested_date]
    # Re-download a small window to find the real date of this price
    start = datetime.strptime(requested_date, '%Y-%m-%d') - timedelta(days=10)
    end = datetime.strptime(requested_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(ref, start=start, end=end, progress=False, auto_adjust=True)
    if df.empty:
        return requested_date
    if isinstance(df.columns, pd.MultiIndex):
        close = df['Close'][ref].dropna()
    elif 'Close' in df.columns:
        close = df['Close'].dropna()
    else:
        close = df.iloc[:, 0].dropna()
    for dt, px in close.items():
        val = float(px.iloc[0]) if isinstance(px, pd.Series) else float(px)
        if abs(val - target_price) < 0.01:
            ts = dt[0] if isinstance(dt, tuple) else dt
            return pd.Timestamp(ts).strftime('%Y-%m-%d')
    return requested_date

today_resolved = _resolve_price_date(prices, today)
if today_resolved != today:
    for ticker in prices:
        if today in prices[ticker]:
            prices[ticker][today_resolved] = prices[ticker].pop(today)
    today = today_resolved

def download_daily(tickers, start_date, end_date):
    """Download daily close prices from yfinance, handling MultiIndex.

    Dates are 'YYYY-MM-DD' strings. Adds a small buffer for trading-day
    alignment."""
    tickers_sorted = sorted(tickers)
    start = datetime.strptime(start_date, '%Y-%m-%d') - timedelta(days=5)
    end = datetime.strptime(end_date, '%Y-%m-%d') + timedelta(days=5)
    df = yf.download(tickers_sorted, start=start, end=end, progress=False,
                     auto_adjust=True)
    if df.empty:
        close = pd.DataFrame()
    elif isinstance(df.columns, pd.MultiIndex) and 'Close' in df.columns.get_level_values(0):
        close = df['Close'].copy()
    elif 'Close' in df.columns:
        close = df[['Close']].copy()
        close.columns = tickers_sorted
    else:
        close = pd.DataFrame()
    for ticker in tickers_sorted:
        if ticker in close.columns and not close[ticker].dropna().empty:
            continue
        series = _download_close_series(ticker, start, end)
        if not series.empty:
            close[ticker] = series
    return close.sort_index()

# -- Historical option prices via MarketData --------------------------------
OPTION_CACHE_DIR = os.path.expanduser('~/My Drive/notes/.sa-lp-option-cache')
_MD_BASE = 'https://api.marketdata.app/v1'
_MD_RATE_DELAY = 0.15
OPTION_CACHE_COLUMNS = [
    'date', 'selected_on', 'option_type', 'symbol', 'strike', 'expiry',
    'delta', 'price']

# Default contract selection parameters. The option proxy picks a contract
# matching option type, with expiry between min_days and max_days of the
# period start, and |delta| closest to delta_target. When the chain is
# sparse, the achieved |delta| may be far from the target; the sensitivity
# block reports achieved |delta| so this is visible rather than silent.
OPTION_DELTA = 0.15
EXPIRY_MIN_DAYS = 270  # ~9 months
EXPIRY_MAX_DAYS = 456  # ~15 months

def _normalize_option_type(option_type):
    option_type = str(option_type).lower()
    if option_type not in ('call', 'put'):
        raise ValueError(f"Unsupported option type: {option_type}")
    return option_type

def _empty_option_cache():
    return pd.DataFrame(columns=OPTION_CACHE_COLUMNS)

def _option_cache_path(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Return the cache CSV path for (ticker, type, delta_target, window).

    When the parameter triple equals the baseline (0.15, 270-456 days), the
    historical filename ``TICKER-TYPE.csv`` is used so the main-backtest
    cache is reused automatically. Any non-baseline combo lives in a
    separate ``TICKER-TYPE-d<delta>-e<min>-<max>.csv`` file so a sensitivity
    sweep never pollutes the baseline cache (which the portfolio calculator
    reads to pick the representative contract for the current filing).
    """
    option_type = _normalize_option_type(option_type)
    is_baseline = (
        abs(delta_target - OPTION_DELTA) < 1e-9
        and min_days == EXPIRY_MIN_DAYS
        and max_days == EXPIRY_MAX_DAYS)
    if is_baseline:
        return os.path.join(OPTION_CACHE_DIR, f'{ticker}-{option_type}.csv')
    return os.path.join(
        OPTION_CACHE_DIR,
        f'{ticker}-{option_type}-d{delta_target:g}-e{min_days}-{max_days}.csv')

def _load_option_cache(ticker, option_type, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Load cached MarketData rows for a ticker/type/target/window.

    Returns DataFrame or empty."""
    option_type = _normalize_option_type(option_type)
    path = _option_cache_path(ticker, option_type, delta_target, min_days,
                              max_days)
    if not os.path.exists(path):
        return _empty_option_cache()
    df = pd.read_csv(path)
    if df.empty:
        return _empty_option_cache()
    for col in OPTION_CACHE_COLUMNS:
        if col not in df.columns:
            df[col] = np.nan
    for col in ('date', 'selected_on'):
        df[col] = pd.to_datetime(
            df[col], errors='coerce').dt.strftime('%Y-%m-%d')
    df['option_type'] = df['option_type'].fillna(option_type).str.lower()
    cache = df[OPTION_CACHE_COLUMNS].copy()
    cache = cache[cache['option_type'] == option_type].copy()
    cache.dropna(subset=['date'], inplace=True)
    for col in ('strike', 'delta', 'price'):
        cache[col] = pd.to_numeric(cache[col], errors='coerce')
    cache.drop_duplicates(
        subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
        keep='last', inplace=True)
    cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
    return cache[OPTION_CACHE_COLUMNS]

def _save_option_cache(ticker, option_type, df, delta_target=OPTION_DELTA,
                       min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Persist typed option cache to CSV."""
    option_type = _normalize_option_type(option_type)
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    path = _option_cache_path(ticker, option_type, delta_target, min_days,
                              max_days)
    if df.empty:
        df = _empty_option_cache()
    else:
        df = df.copy()
        df['option_type'] = option_type
        for col in OPTION_CACHE_COLUMNS:
            if col not in df.columns:
                df[col] = np.nan
    df.drop_duplicates(
        subset=['date', 'selected_on', 'option_type', 'strike', 'expiry'],
        keep='last', inplace=True)
    df.sort_values(['date', 'expiry', 'strike'], inplace=True)
    df.to_csv(path, index=False)

def _contract_window(ref_date_str, min_days=EXPIRY_MIN_DAYS,
                     max_days=EXPIRY_MAX_DAYS):
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    return ref + timedelta(days=min_days), ref + timedelta(days=max_days)

def _contract_from_cache_row(row, ref_date_str, option_type,
                             min_days=EXPIRY_MIN_DAYS,
                             max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    if str(row.get('option_type', option_type)).lower() != option_type:
        return None
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    try:
        exp = datetime.strptime(str(row['expiry']), '%Y-%m-%d')
    except (KeyError, TypeError, ValueError):
        return None
    if not (lo <= exp <= hi):
        return None
    strike = _safe_float(row.get('strike'))
    delta = _safe_float(row.get('delta'))
    price = _safe_float(row.get('price'))
    if strike is None or delta is None or price is None or price <= 0:
        return None
    return {
        'selected_on': row.get('selected_on'),
        'option_type': option_type,
        'symbol': row.get('symbol'),
        'strike': strike,
        'expiry': str(row['expiry']),
        'delta': delta,
        'price': price,
    }

def _select_cached_contract(cache, option_type, ref_date_str,
                            delta_target=OPTION_DELTA,
                            min_days=EXPIRY_MIN_DAYS,
                            max_days=EXPIRY_MAX_DAYS,
                            require_selected=False):
    rows = cache[(cache['date'] == ref_date_str)
                 & (cache['option_type'] == option_type)]
    selected_rows = rows[rows['selected_on'] == ref_date_str]
    if not selected_rows.empty:
        rows = selected_rows
    elif require_selected:
        rows = selected_rows
    candidates = []
    for _, row in rows.iterrows():
        contract = _contract_from_cache_row(row, ref_date_str, option_type,
                                            min_days, max_days)
        if contract:
            candidates.append(contract)
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def _parse_option_price(contract):
    """Extract a mark price from an option contract record."""
    mid = _safe_float(contract.get('mid'))
    if mid and mid > 0:
        return mid
    bid = _safe_float(contract.get('bid'))
    ask = _safe_float(contract.get('ask'))
    last = _safe_float(contract.get('last'))
    if bid and ask and bid > 0 and ask > 0:
        return (bid + ask) / 2
    if last and last > 0:
        return last
    return None

def _safe_float(val):
    try:
        out = float(val)
        if np.isnan(out):
            return None
        return out
    except (TypeError, ValueError):
        return None
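The mark-price fallback order (vendor mid, then bid/ask midpoint, then last trade) can be illustrated on toy contract records. This is an illustrative restatement of the logic, not the script's `_parse_option_price` itself:

```python
def mark_price(contract):
    # mid → (bid + ask) / 2 → last, mirroring _parse_option_price above
    mid = contract.get('mid')
    if mid and mid > 0:
        return mid
    bid, ask = contract.get('bid'), contract.get('ask')
    if bid and ask and bid > 0 and ask > 0:
        return (bid + ask) / 2
    last = contract.get('last')
    if last and last > 0:
        return last
    return None

print(mark_price({'mid': 1.25}))             # 1.25 — vendor mid wins
print(mark_price({'bid': 1.0, 'ask': 1.5}))  # 1.25 — bid/ask midpoint
print(mark_price({'last': 0.9}))             # 0.9  — last trade as fallback
print(mark_price({}))                        # None — no usable price
```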
def _marketdata_key():
    """Return the MarketData API key, or None if unavailable.

    Resolution order:

    1. ``MARKETDATA_KEY`` / ``MARKETDATA_API_KEY`` environment variables.
    2. ``pass env/marketdata-token`` (local ``pass`` store).

    The result is memoised on the function object so repeated lookups
    during a sweep do not shell out again. The fetch helpers raise when
    called without a key, so a fully cached run still succeeds without
    requiring either source.
    """
    if hasattr(_marketdata_key, '_cached'):
        return _marketdata_key._cached
    key = (os.environ.get('MARKETDATA_KEY', '')
           or os.environ.get('MARKETDATA_API_KEY', ''))
    if not key:
        try:
            import subprocess
            out = subprocess.run(
                ['pass', 'show', 'env/marketdata-token'],
                capture_output=True, text=True, timeout=5, check=False)
            if out.returncode == 0 and out.stdout.strip():
                key = out.stdout.strip().splitlines()[0]
        except (FileNotFoundError, subprocess.TimeoutExpired):
            key = ''
    _marketdata_key._cached = key or None
    return _marketdata_key._cached
def _marketdata_get(path, params, api_key):
    """Fetch a MarketData endpoint, returning normalized row dictionaries.

    Raises on HTTP errors or a non-'ok' status. 'no_data' is returned as
    an empty list so that callers can distinguish 'nothing available' from
    'request failed'.
    """
    headers = {'Accept': 'application/json',
               'Authorization': f'Bearer {api_key}'}
    resp = requests.get(_MD_BASE + path, params=params, headers=headers,
                        timeout=30)
    resp.raise_for_status()
    body = resp.json()
    status = body.get('s')
    if status == 'no_data':
        return []
    if status != 'ok':
        raise RuntimeError(
            f"MarketData {path} returned status={status!r}: "
            f"{body.get('errmsg') or body}")
    lengths = [len(v) for v in body.values() if isinstance(v, list)]
    n = max(lengths) if lengths else 0
    rows = []
    for i in range(n):
        row = {}
        for key, val in body.items():
            if isinstance(val, list):
                row[key] = val[i] if i < len(val) else None
            else:
                row[key] = val
        rows.append(row)
    return rows

def _marketdata_date(timestamp):
    try:
        return datetime.utcfromtimestamp(int(timestamp)).strftime('%Y-%m-%d')
    except (TypeError, ValueError, OSError):
        return None

def _occ_symbol(ticker, option_type, strike, expiry):
    """Build a standard OCC option symbol from contract fields."""
    cp = 'C' if _normalize_option_type(option_type) == 'call' else 'P'
    exp = datetime.strptime(str(expiry), '%Y-%m-%d').strftime('%y%m%d')
    strike_int = int(round(float(strike) * 1000))
    root = ticker.upper().replace('.', '')
    return f'{root}{exp}{cp}{strike_int:08d}'

# Chains are always fetched with a broad expiry window so they can be cached
# and reused for in-memory selection across any (delta_target, expiry window)
# combination in the sensitivity sweep.
CHAIN_FETCH_MIN_DAYS = 30
CHAIN_FETCH_MAX_DAYS = 760

def _fetch_marketdata_chain(ticker, date_str, option_type, api_key,
                            min_days=CHAIN_FETCH_MIN_DAYS,
                            max_days=CHAIN_FETCH_MAX_DAYS):
    lo, hi = _contract_window(date_str, min_days, max_days)
    params = {
        'date': date_str,
        'from': lo.strftime('%Y-%m-%d'),
        'to': hi.strftime('%Y-%m-%d'),
        'side': _normalize_option_type(option_type),
        'expiration': 'all',
    }
    return _marketdata_get(f'/options/chain/{ticker}/', params, api_key)

# Chain cache: one CSV per (ticker, type, date) storing the broad-window
# chain. Lets the sensitivity sweep re-select contracts for different delta
# targets and expiry windows without refetching.
CHAIN_CACHE_DIR = os.path.join(OPTION_CACHE_DIR, 'chains')

def _chain_cache_path(ticker, option_type, date_str):
    option_type = _normalize_option_type(option_type)
    return os.path.join(CHAIN_CACHE_DIR,
                        f'{ticker}-{option_type}-{date_str}.csv')

def _load_chain_cache(ticker, option_type, date_str):
    path = _chain_cache_path(ticker, option_type, date_str)
    if not os.path.exists(path):
        return None
    df = pd.read_csv(path)
    if df.empty:
        return []
    return df.to_dict('records')

def _save_chain_cache(ticker, option_type, date_str, chain):
    if not chain:
        return
    os.makedirs(CHAIN_CACHE_DIR, exist_ok=True)
    path = _chain_cache_path(ticker, option_type, date_str)
    pd.DataFrame(chain).to_csv(path, index=False)

def _get_or_fetch_chain(ticker, date_str, option_type, api_key,
                        fetched_counter=None):
    """Return the cached broad chain for (ticker, type, date), fetching if
    absent. Requires ``api_key`` only when a fetch is actually needed.
    """
    chain = _load_chain_cache(ticker, option_type, date_str)
    if chain is not None:
        return chain
    if not api_key:
        raise RuntimeError(
            "MARKETDATA_KEY is not set but a chain fetch is required for "
            f"{ticker} {option_type} on {date_str}.")
    time.sleep(_MD_RATE_DELAY)
    chain = _fetch_marketdata_chain(ticker, date_str, option_type, api_key)
    if fetched_counter is not None:
        fetched_counter['marketdata_chains'] += 1
    _save_chain_cache(ticker, option_type, date_str, chain)
    return chain

def _fetch_marketdata_quotes(symbol, start_date, end_date, api_key):
    to_date = (datetime.strptime(end_date, '%Y-%m-%d')
               + timedelta(days=1)).strftime('%Y-%m-%d')
    rows = _marketdata_get(f'/options/quotes/{symbol}/',
                           {'from': start_date, 'to': to_date}, api_key)
    prices = {}
    for row in rows:
        date_str = _marketdata_date(row.get('updated'))
        if not date_str:
            continue
        price = _parse_option_price(row)
        if price is not None and price > 0:
            prices[date_str] = price
    return prices
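By way of example, the OCC symbol built by `_occ_symbol` concatenates the root (ticker with dots stripped), the expiry as `yymmdd`, `C` or `P`, and the strike in thousandths zero-padded to eight digits. An illustrative copy with hypothetical contracts:

```python
from datetime import datetime

def occ_symbol(ticker, option_type, strike, expiry):
    # Same construction as _occ_symbol above.
    cp = 'C' if option_type == 'call' else 'P'
    exp = datetime.strptime(str(expiry), '%Y-%m-%d').strftime('%y%m%d')
    strike_int = int(round(float(strike) * 1000))
    root = ticker.upper().replace('.', '')
    return f'{root}{exp}{cp}{strike_int:08d}'

print(occ_symbol('AAPL', 'call', 250, '2026-06-18'))    # AAPL260618C00250000
print(occ_symbol('BRK.B', 'put', 450.5, '2026-01-16'))  # BRKB260116P00450500
```

The symbol is only a fallback here: MarketData's own `optionSymbol` field is preferred when the chain provides it.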
def _implied_vol_from_price(S, K, T, option_price, option_type):
    """Infer Black-Scholes volatility from an observed option mid price."""
    if any(x is None for x in (S, K, T, option_price)):
        return None
    if S <= 0 or K <= 0 or T <= 0 or option_price <= 0:
        return None
    intrinsic = max(S - K, 0) if option_type == 'call' else max(K - S, 0)
    upper = S if option_type == 'call' else K
    if option_price < intrinsic - 1e-6 or option_price > upper * 1.5:
        return None
    lo, hi = 1e-4, 5.0
    try:
        if (option_price < bs_price(S, K, T, lo, option_type) - 1e-4
                or option_price > bs_price(S, K, T, hi, option_type) + 1e-4):
            return None
        # Bisect on sigma: the BS price is monotone increasing in volatility.
        for _ in range(80):
            mid = (lo + hi) / 2
            if bs_price(S, K, T, mid, option_type) < option_price:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    except (FloatingPointError, ValueError, ZeroDivisionError):
        return None

def _marketdata_delta(row, ref_date_str, expiry, option_type, price):
    """Use vendor delta when present; otherwise infer it from the quote."""
    native = _safe_float(row.get('delta'))
    if native is not None and native != 0:
        return native
    S = _safe_float(row.get('underlyingPrice'))
    K = _safe_float(row.get('strike'))
    ref = datetime.strptime(ref_date_str, '%Y-%m-%d')
    exp = datetime.strptime(expiry, '%Y-%m-%d')
    T = max((exp - ref).days / 365.25, 1e-6)
    sigma = _safe_float(row.get('iv'))
    if sigma is None or sigma <= 0:
        sigma = _implied_vol_from_price(S, K, T, price, option_type)
    if S is None or K is None or sigma is None or sigma <= 0:
        return None
    return bs_delta(S, K, T, sigma, option_type)

def _select_marketdata_contract(chain, ref_date_str, option_type,
                                delta_target=OPTION_DELTA,
                                min_days=EXPIRY_MIN_DAYS,
                                max_days=EXPIRY_MAX_DAYS):
    option_type = _normalize_option_type(option_type)
    lo, hi = _contract_window(ref_date_str, min_days, max_days)
    candidates = []
    for c in chain:
        if str(c.get('side', '')).lower() != option_type:
            continue
        expiry = _marketdata_date(c.get('expiration'))
        if not expiry:
            continue
        exp = datetime.strptime(expiry, '%Y-%m-%d')
        if not (lo <= exp <= hi):
            continue
        price = _parse_option_price(c)
        if price is None or price <= 0:
            continue
        delta = _marketdata_delta(c, ref_date_str, expiry, option_type, price)
        if delta is None or delta == 0:
            continue
        strike = _safe_float(c.get('strike'))
        symbol = c.get('optionSymbol')
        if strike is None or not symbol:
            continue
        candidates.append({
            'option_type': option_type,
            'symbol': symbol,
            'strike': strike,
            'expiry': expiry,
            'delta': delta,
            'price': price,
        })
    if not candidates:
        return None
    candidates.sort(key=lambda x: abs(abs(x['delta']) - delta_target))
    return candidates[0]

def download_option_prices(option_positions, quarters, holdings, filing_dates,
                           today, delta_target=OPTION_DELTA,
                           min_days=EXPIRY_MIN_DAYS, max_days=EXPIRY_MAX_DAYS):
    """Download historical representative option prices from MarketData.

    MarketData is the sole supported provider. MARKETDATA_KEY must be set.
    For each (ticker, option_type) and each filing period in which that
    position is held:

    1. On the first trading day, select a contract matching type, with
       expiry between ``min_days`` and ``max_days`` of the period start, and
       |delta| closest to ``delta_target``. MarketData's Starter plan often
       returns null Greeks, so delta is inferred from the observed mid price
       via Black-Scholes when the vendor delta is missing.
    2. Lock in that contract for the period.
    3. Track its historical mid price through the period.

    The broad option chain for each (ticker, type, first_day) is cached to
    disk so that sensitivity sweeps over (delta_target, expiry window) reuse
    a single fetch.

    Raises ``RuntimeError`` if no suitable contract can be selected for any
    required (ticker, type, period), or if MarketData returns no price
    series for the selected contract.

    Parameters
    ----------
    delta_target : float
        Target |delta| for contract selection (default ``OPTION_DELTA``).
    min_days, max_days : int
        Contract expiry window in days from period start (default 270-456,
        i.e. 9-15 months).

    Returns
    -------
    per_period : dict {quarter_str: {(ticker, type): {date_str: float}}}
        Option prices keyed by filing period then option position. Each
        period has its own contract's prices.
    """
    option_positions = sorted({
        (ticker, _normalize_option_type(pos_type))
        for ticker, pos_type in option_positions})
    md_key = _marketdata_key()
    os.makedirs(OPTION_CACHE_DIR, exist_ok=True)
    per_period = {}  # {q: {(ticker, type): {date_str: price}}}
    fetched = {'marketdata_chains': 0, 'marketdata_quotes': 0}
    for ticker, option_type in option_positions:
        opt_key = _option_position_key(ticker, option_type)
        cache = _load_option_cache(ticker, option_type, delta_target,
                                   min_days, max_days)
        new_rows = []
        for i, q in enumerate(quarters):
            # Skip quarters where this exact option position is absent.
            if opt_key not in holdings[q]:
                continue
            period_start = filing_dates[q]
            period_end = (filing_dates[quarters[i + 1]]
                          if i < len(quarters) - 1 else today)
            trading_days = pd.bdate_range(period_start, period_end)
            if len(trading_days) == 0:
                continue
            first_day = trading_days[0].strftime('%Y-%m-%d')
            # -- Select contract on first trading day --
            contract = _select_cached_contract(
                cache, option_type, first_day,
                delta_target=delta_target,
                min_days=min_days, max_days=max_days,
                require_selected=True)
            if contract is None:
                chain = _get_or_fetch_chain(
                    ticker, first_day, option_type, md_key, fetched)
                contract = _select_marketdata_contract(
                    chain, first_day, option_type,
                    delta_target=delta_target,
                    min_days=min_days, max_days=max_days)
                if contract is None:
                    raise RuntimeError(
                        f"MarketData returned no usable {option_type} contract "
                        f"for {ticker} on {first_day} (period {q}) at "
                        f"delta={delta_target}, "
                        f"expiry {min_days}-{max_days}d")
                new_rows.append({
                    'date': first_day,
                    'selected_on': first_day,
                    'option_type': option_type,
                    'symbol': contract.get('symbol'),
                    'strike': contract['strike'],
                    'expiry': contract['expiry'],
                    'delta': contract['delta'],
                    'price': contract['price'],
                })
            strike = contract['strike']
            expiry = contract['expiry']
            symbol = contract.get('symbol') or _occ_symbol(
                ticker, option_type, strike, expiry)
            # -- Collect prices for this period (fresh dict per period) --
            period_prices = {}
            # Fast path: read matching prices from cache.
            rows = cache[
                (cache['date'] >= period_start)
                & (cache['date'] <= period_end)
                & (cache['option_type'] == option_type)
                & (abs(cache['strike'] - strike) < 0.01)
                & (cache['expiry'].astype(str) == str(expiry))
                & pd.notna(cache['price'])]
            selected_rows = rows[rows['selected_on'] == first_day]
            if not selected_rows.empty:
                rows = selected_rows
            for _, row in rows.iterrows():
                period_prices[row['date']] = float(row['price'])
            # Decide whether to refresh quotes. With a key, refresh whenever
            # the cached series does not reach period_end. Without a key,
            # only fail if the cached series is empty; a slightly stale tail
            # is acceptable for cache-only runs (e.g. sensitivity sweeps
            # replaying the baseline contract).
            has_partial = bool(period_prices)
            reaches_end = has_partial and max(period_prices) >= period_end
            if md_key and not reaches_end:
                time.sleep(_MD_RATE_DELAY)
                quote_prices = _fetch_marketdata_quotes(
                    symbol, period_start, period_end, md_key)
                fetched['marketdata_quotes'] += 1
                for day_str, price in quote_prices.items():
                    if period_start <= day_str <= period_end:
                        period_prices[day_str] = price
                        new_rows.append({
                            'date': day_str,
                            'selected_on': first_day,
                            'option_type': option_type,
                            'symbol': symbol,
                            'strike': strike,
                            'expiry': expiry,
                            'delta': contract['delta'],
                            'price': price,
                        })
                if contract.get('price') and first_day not in period_prices:
                    period_prices[first_day] = contract['price']
            elif not md_key and not has_partial:
                raise RuntimeError(
                    "MARKETDATA_KEY is not set and no cached quotes exist "
                    f"for {symbol} in {period_start}..{period_end}.")
            if not period_prices:
                raise RuntimeError(
                    f"MarketData returned no quotes for {symbol} "
                    f"({opt_key}) in {period_start}..{period_end}")
            per_period.setdefault(q, {})[opt_key] = period_prices
        # Persist new data to cache
        if new_rows:
            new_df = pd.DataFrame(new_rows)
            cache = pd.concat([cache, new_df], ignore_index=True)
            cache.drop_duplicates(
                subset=['date', 'selected_on', 'option_type', 'strike',
                        'expiry'],
                keep='last', inplace=True)
            cache.sort_values(['date', 'expiry', 'strike'], inplace=True)
            _save_option_cache(ticker, option_type, cache, delta_target,
                               min_days, max_days)
    if any(fetched.values()):
        import sys
        parts = []
        if fetched['marketdata_chains']:
            parts.append(f"{fetched['marketdata_chains']} MarketData chains")
        if fetched['marketdata_quotes']:
            parts.append(f"{fetched['marketdata_quotes']} MarketData quote series")
        print(f"[options] Fetched {', '.join(parts)}", file=sys.stderr)
    return per_period

# -- Black-Scholes helpers (used only to infer delta when MarketData's
# Starter-plan historical Greeks are null; never to reprice returns) -----
from scipy.stats import norm as _norm

def bs_price(S, K, T, sigma, option_type='call'):
    """Black-Scholes option price (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return max(S - K, 0)
        return max(K - S, 0)
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if option_type == 'call':
        return S * _norm.cdf(d1) - K * _norm.cdf(d2)
    return K * _norm.cdf(-d2) - S * _norm.cdf(-d1)

def bs_delta(S, K, T, sigma, option_type='call'):
    """Black-Scholes delta (assumes zero risk-free rate and dividends)."""
    if T <= 0 or sigma <= 0:
        if option_type == 'call':
            return 1.0 if S > K else 0.0
        return -1.0 if S < K else 0.0
    d1 = (np.log(S / K) + (sigma ** 2 / 2) * T) / (sigma * np.sqrt(T))
    if option_type == 'call':
        return _norm.cdf(d1)
    return _norm.cdf(d1) - 1

def daily_cumulative(holdings, quarters, filing_dates, close, today, mode,
                     per_period_opt=None):
    """Build a daily series of cumulative growth factors for a given mode.

    For each filing period, stock shares and option contracts are fixed.
    In equity-proxy mode, option rows are converted to linear underlying
    exposure: calls are long underlying and puts are short underlying. In
    option-proxy mode, option rows are sized by 13F underlying notional and
    returns come from MarketData quotes; returns are divided by deployed
    capital (stock value plus option premium cost). Option-proxy mode
    raises if MarketData prices are missing for any required position.
    """
    cum_growth = 1.0
    dates_out = []
    values_out = []
    for i, q in enumerate(quarters):
        period_start = filing_dates[q]
        period_end = (filing_dates[quarters[i + 1]]
                      if i < len(quarters) - 1 else today)
        ps = pd.Timestamp(period_start)
        pe = pd.Timestamp(period_end)
        # Trading days in this period
        mask = (close.index >= ps) & (close.index <= pe)
        period_close = close[mask]
        if period_close.empty:
            continue
        # Option prices for this period (keyed by (ticker, type) → prices)
        quarter_opt = per_period_opt.get(q, {}) if per_period_opt else {}
        # Determine starting prices, fixed exposure, and deployed capital.
        positions = holdings[q]
        exposure = {}
        costs = {}
        start_prices = {}
        start_underlying = {}
        use_opt_px = {}  # track which positions use option prices
        total_cost = 0
        for (ticker, pos_type), value in positions.items():
            is_option = pos_type in ('call', 'put')
            opt_key = _option_position_key(ticker, pos_type)
            if mode == 'equity_only':
                if pos_type not in ('long', 'call', 'put'):
                    continue
                if ticker not in close.columns:
                    continue
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
                if avail.empty:
                    continue
                stock_start = float(avail.iloc[0])
                if stock_start <= 0:
                    continue
                start_prices[(ticker, pos_type)] = stock_start
                start_underlying[(ticker, pos_type)] = stock_start
                costs[(ticker, pos_type)] = value
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = False
                total_cost += value
                continue
            # mode == 'full' (option proxy)
            if is_option:
                if opt_key not in quarter_opt:
                    raise RuntimeError(
                        f"No MarketData option prices for {opt_key} in "
                        f"period {q}")
                ticker_opt = quarter_opt[opt_key]
                opt_dates = sorted(d for d in ticker_opt if d >= period_start)
                if not opt_dates:
                    raise RuntimeError(
                        f"MarketData option prices for {opt_key} in period "
                        f"{q} contain no dates at or after {period_start}")
                if ticker not in close.columns:
                    raise RuntimeError(
                        f"No underlying close series for {ticker}")
                src = close[ticker].dropna()
                avail = src[src.index >= ps]
                if avail.empty:
                    raise RuntimeError(
                        f"No underlying price for {ticker} at {period_start}")
                opt_start = ticker_opt[opt_dates[0]]
                underlying_start = float(avail.iloc[0])
                if opt_start <= 0 or underlying_start <= 0:
                    raise RuntimeError(
                        f"Non-positive starting price for {opt_key} in "
                        f"period {q}")
                start_prices[(ticker, pos_type)] = opt_start
                start_underlying[(ticker, pos_type)] = underlying_start
                costs[(ticker, pos_type)] = value * opt_start / underlying_start
                exposure[(ticker, pos_type)] = value
                use_opt_px[(ticker, pos_type)] = True
                total_cost += costs[(ticker, pos_type)]
                continue
            # Plain stock in full mode
            if ticker not in close.columns:
                continue
            src = close[ticker].dropna()
            avail = src[src.index >= ps]
            if avail.empty:
                continue
            stock_start = float(avail.iloc[0])
            if stock_start <= 0:
                continue
            start_prices[(ticker, pos_type)] = stock_start
            start_underlying[(ticker, pos_type)] = stock_start
            costs[(ticker, pos_type)] = value
            exposure[(ticker, pos_type)] = value
            use_opt_px[(ticker, pos_type)] = False
            total_cost += value
        if total_cost == 0:
            continue
        # Daily P&L relative to period start. Skip first day of subsequent
        # periods (already recorded as last day of the prior period) to
        # avoid duplicate boundary dates.
        start_idx = 1 if i > 0 else 0
        # Forward-fill: track last known option price so that gaps in
        # option data don't cause positions to vanish mid-period.
        last_opt = {k: v for k, v in start_prices.items()
                    if use_opt_px.get(k)}
        for day_idx in range(start_idx, len(period_close)):
            day = period_close.index[day_idx]
            day_str = day.strftime('%Y-%m-%d')
            period_pnl = 0
            for (ticker, pos_type), value in exposure.items():
                p0 = start_prices[(ticker, pos_type)]
                if p0 == 0:
                    continue
                if use_opt_px[(ticker, pos_type)]:
                    opt_key = _option_position_key(ticker, pos_type)
                    p1_val = quarter_opt.get(opt_key, {}).get(day_str)
                    if p1_val is not None:
                        last_opt[(ticker, pos_type)] = p1_val
                    else:
                        p1_val = last_opt.get((ticker, pos_type))
                        if p1_val is None:
                            continue
                    underlying_p0 = start_underlying.get((ticker, pos_type))
                    if not underlying_p0 or underlying_p0 <= 0:
                        continue
                    position_pnl = value * (float(p1_val) - p0) / underlying_p0
                else:
                    if ticker not in period_close.columns:
                        continue
                    p1_val = period_close[ticker].iloc[day_idx]
                    if pd.isna(p1_val):
                        continue
                    stock_ret = (float(p1_val) - p0) / p0
                    if mode == 'equity_only':
                        position_pnl = (
                            value * _linear_underlying_sign(pos_type)
                            * stock_ret)
                    else:
                        position_pnl = value * stock_ret
                period_pnl += position_pnl
            dates_out.append(day)
            values_out.append(cum_growth * (1 + period_pnl / total_cost))
        # Chain: next period starts from the last day's growth factor
        if values_out:
            cum_growth = values_out[-1]
    return dates_out, values_out

import os
HUGO_BASE = os.path.expanduser('~/My Drive/repos/stafforini.com')

# -- Build position data for both modes --------------------------------
latest = parsed["filings"][-1]
pos = {}
for h in latest["holdings"]:
    key = (h["ticker"], h["type"])
    pos[key] = pos.get(key, 0) + h["value"]
eq_pos = pos

# Fetch current underlying prices for all rows
calc_tickers = sorted({t for (t, _) in pos})
current = get_prices(calc_tickers, [today])

# Load option contract info for the latest quarter. The baseline cache may
# legitimately hold more than one contract per (ticker, selected_on) when
# the sensitivity sweep's expiry windows overlap with the baseline 9-15m
# window. Pick the one whose |delta| is closest to OPTION_DELTA (matching
# _select_cached_contract's tie-breaking logic) so the calculator is
# deterministic and stays consistent with the main backtest's
# representative selection.
latest_fd = latest["filing_date"]
opt_contracts = {}
for h in latest["holdings"]:
    if h["type"] in ('call', 'put'):
        key = (h["ticker"], h["type"])
        if key in opt_contracts:
            continue
        cache = _load_option_cache(h["ticker"], h["type"])
        selected_rows = cache[(cache['selected_on'] == latest_fd) &
                              (cache['option_type'] == h["type"]) &
                              pd.notna(cache['delta']) &
                              pd.notna(cache['strike']) &
                              pd.notna(cache['price'])]
        if selected_rows.empty:
            continue
        # Identify the canonical contract (closest to baseline delta),
        # then pull its most recent price from the cache.
        canonical = selected_rows.iloc[
            (selected_rows['delta'].abs() - OPTION_DELTA).abs().argsort()
        ].iloc[0]
        strike = float(canonical['strike'])
        expiry = str(canonical['expiry'])
        price_rows = cache[(cache['option_type'] == h["type"]) &
                           (abs(cache['strike'] - strike) < 0.01) &
                           (cache['expiry'].astype(str) == expiry) &
                           pd.notna(cache['price']) &
                           (cache['date'] >= latest_fd)]
        if price_rows.empty:
            continue
        latest_row = price_rows.sort_values('date').iloc[-1]
        opt_contracts[key] = {
            'strike': strike,
            'expiry': expiry,
            'price': round(float(latest_row['price']), 2),
            'price_as_of': str(latest_row['date']),
        }
# Build JSON data for both modes. In equity-proxy mode, option rows become
# linear underlying exposure. In option-proxy mode, reported option value is
# underlying notional; capital_basis estimates the deployed premium.
def build_mode_data(positions, option_proxy=False):
    rows = []
    for (ticker, pos_type), value in sorted(positions.items(),
                                            key=lambda x: -x[1]):
        underlying_price = None
        row = {"ticker": ticker, "type": pos_type,
               "reported_value": round(value, 2)}
        if ticker in current and today in current[ticker]:
            underlying_price = round(current[ticker][today], 2)
        if pos_type == 'long' or (pos_type in ('call', 'put')
                                  and not option_proxy):
            direction = 'short' if pos_type == 'put' else 'long'
            row.update({"instrument": "stock", "price": underlying_price,
                        "underlying_price": underlying_price,
                        "multiplier": 1, "direction": direction})
            capital_basis = value
        elif pos_type in ('call', 'put'):
            row.update({"instrument": "option", "price": None,
                        "underlying_price": underlying_price,
                        "multiplier": 100, "direction": "long"})
            if (ticker, pos_type) in opt_contracts:
                row.update(opt_contracts[(ticker, pos_type)])
            option_price = row.get("price")
            if (option_price is not None and underlying_price is not None
                    and option_price > 0 and underlying_price > 0):
                capital_basis = value * option_price / underlying_price
            else:
                capital_basis = None
        else:
            row.update({"instrument": "stock", "price": underlying_price,
                        "underlying_price": underlying_price,
                        "multiplier": 1, "direction": "long"})
            capital_basis = value
        row["capital_basis"] = (
            round(capital_basis, 6) if capital_basis is not None else None)
        rows.append(row)
    total_basis = sum(r["capital_basis"] for r in rows
                      if r["capital_basis"] is not None)
    for row in rows:
        if row["capital_basis"] is not None and total_basis > 0:
            row["weight"] = round(row["capital_basis"] / total_basis, 6)
        else:
            row["weight"] = 0
    return rows

eq_data = build_mode_data(eq_pos, option_proxy=False)
full_data = build_mode_data(pos, option_proxy=True)
quarter = latest["quarter"].replace("_", " ")
filing_date = latest["filing_date"]
# -- Generate self-contained HTML --------------------------------------
CSS = (
'/* reset */ * { margin: 0; padding: 0; box-sizing: border-box; }\n'
'body { font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto,\n'
' sans-serif; font-size: 14px; background: transparent;\n'
' color: #333; padding: 16px 0; }\n'
'.controls { display: flex; gap: 16px; align-items: center;\n'
' flex-wrap: wrap; margin-bottom: 12px; }\n'
'.controls label { font-weight: 600; font-size: 13px; }\n'
'.controls input, .controls select {\n'
' padding: 6px 10px; border: 1px solid #ccc; border-radius: 4px;\n'
' font-size: 14px; background: #fff; color: #333; }\n'
'.controls input { width: 140px; }\n'
'.meta { font-size: 12px; color: #888; margin-bottom: 12px; }\n'
'.muted { color: #888; font-size: 11px; }\n'
'table { width: 100%; border-collapse: collapse; font-size: 13px;\n'
' font-variant-numeric: tabular-nums; }\n'
'th { text-align: left; padding: 6px 10px; border-bottom: 2px solid #ddd;\n'
' font-weight: 600; font-size: 12px; text-transform: uppercase;\n'
' letter-spacing: 0.03em; color: #666; }\n'
'th.r, td.r { text-align: right; }\n'
'td { padding: 5px 10px; border-bottom: 1px solid #eee; }\n'
'tr:hover td { background: rgba(0,0,0,0.02); }\n'
'.tag { display: inline-block; padding: 1px 6px; border-radius: 3px;\n'
' font-size: 11px; font-weight: 600; }\n'
'.tag-long { background: #dcfce7; color: #166534; }\n'
'.tag-call { background: #dbeafe; color: #1e40af; }\n'
'.tag-put { background: #fee2e2; color: #991b1b; }\n'
'.summary { margin-top: 12px; font-size: 13px; display: flex;\n'
' gap: 24px; font-weight: 500; }\n'
'.summary span { color: #666; font-weight: 400; }\n'
'td.cb { width: 24px; text-align: center; }\n'
'td.cb input { margin: 0; cursor: pointer; }\n'
'tr.excluded td { opacity: 0.35; }\n'
'tr.excluded td.cb { opacity: 1; }\n'
'body.dark { color: #d4d4d4; }\n'
'body.dark .controls input, body.dark .controls select {\n'
' background: #2a2a2a; color: #d4d4d4; border-color: #555; }\n'
'body.dark th { color: #999; border-bottom-color: #444; }\n'
'body.dark td { border-bottom-color: #333; }\n'
'body.dark tr:hover td { background: rgba(255,255,255,0.03); }\n'
'body.dark .tag-long { background: #14532d; color: #86efac; }\n'
'body.dark .tag-call { background: #1e3a5f; color: #93c5fd; }\n'
'body.dark .tag-put { background: #450a0a; color: #fca5a5; }\n'
'body.dark .meta { color: #777; }\n'
'body.dark tr.excluded td { opacity: 0.3; }\n'
'body.dark .summary span { color: #888; }\n'
)
JS = r"""
var DATA = {
equity_only: %s,
full: %s
};
var excluded = {};
function posKey(r) { return r.ticker + '_' + r.type; }
function validBasis(r) {
return typeof r.capital_basis === 'number' &&
isFinite(r.capital_basis) &&
r.capital_basis > 0;
}
function syncCutoff() {
var cutoff = (parseFloat(document.getElementById('cutoff').value) || 0) / 100;
var mode = document.getElementById('mode').value;
var rows = DATA[mode];
excluded = {};
rows.forEach(function(r) {
    if (r.weight < cutoff) excluded[posKey(r)] = true;
  });
}
function render() {
  var bankroll = parseFloat(document.getElementById('bankroll').value) || 0;
  var mode = document.getElementById('mode').value;
  var rows = DATA[mode];
  var showOptionDetails = mode === 'full';
  // Show mode description
  var descEl = document.getElementById('mode-desc');
  if (mode === 'equity_only') {
    descEl.textContent = 'Uses shares only; calls become long underlying and puts become short underlying.';
  } else {
    descEl.textContent = 'Uses deployed capital; option rows target 13F underlying notional and spend estimated premium.';
  }
  // All rows shown; excluded rows are greyed out
  var active = rows.filter(function(r) {
    return !excluded[posKey(r)] && validBasis(r);
  });
  var totalBasis = active.reduce(function(s, r) {
    return s + r.capital_basis;
  }, 0);
  var allocated = 0;
  var unsizedCapital = 0;
  var computed = rows.map(function(r) {
    var key = posKey(r);
    var isExcl = !!excluded[key];
    var hasBasis = validBasis(r);
    var adjWeight = (!isExcl && hasBasis && totalBasis > 0) ?
r.capital_basis / totalBasis : 0;
var scale = (!isExcl && hasBasis && totalBasis > 0) ?
bankroll / totalBasis : 0;
var target = (!isExcl && hasBasis) ? r.reported_value * scale : null;
var capitalTarget = (!isExcl && hasBasis) ?
r.capital_basis * scale : 0;
var multiplier = r.multiplier || 1;
var isOption = r.instrument === 'option';
var direction = r.direction || 'long';
var sizingPrice = isOption ? r.underlying_price : r.price;
if (!sizingPrice || !r.price || isExcl || !hasBasis) {
if (!isExcl && hasBasis) unsizedCapital += capitalTarget;
return { ticker: r.ticker, type: r.type, weight: r.weight,
adjWeight: adjWeight, target: target,
excluded: isExcl, key: key,
instrument: r.instrument || 'stock',
strike: r.strike || null, expiry: r.expiry || null,
underlyingPrice: r.underlying_price || null,
priceAsOf: r.price_as_of || null,
direction: direction,
price: r.price, units: null, cost: null };
}
var units = Math.floor(target / (sizingPrice * multiplier));
var signedUnits = direction === 'short' ? -units : units;
var cost = units * r.price * multiplier;
allocated += cost;
return { ticker: r.ticker, type: r.type, weight: r.weight,
adjWeight: adjWeight, target: target,
excluded: isExcl, key: key, instrument: r.instrument || 'stock',
strike: r.strike || null, expiry: r.expiry || null,
underlyingPrice: r.underlying_price || null,
priceAsOf: r.price_as_of || null,
direction: direction,
price: r.price, units: signedUnits, cost: cost };
});
var html = '<table><thead><tr>';
html += '<th/><th>Ticker</th>';
html += '<th>Type</th>';
if (showOptionDetails) html += '<th class="r">Strike</th><th class="r">Expiry</th>';
html += '<th class="r">Weight</th>';
html += '<th class="r">Target</th>';
html += '<th class="r">Price</th>';
html += '<th class="r">Units</th><th class="r">Cost</th></tr></thead><tbody>';
computed.forEach(function(c) {
html += '<tr' + (c.excluded ? ' class="excluded"' : '') + '>';
html += '<td class="cb"><input type="checkbox" data-key="' + c.key + '" ' + (c.excluded ? '' : ' checked') + ' />';
html += '<td><strong>' + c.ticker + '</strong></td>';
var cls = c.type === 'put' ? 'tag-put' : c.type === 'call' ? 'tag-call' : 'tag-long';
var typeText = c.type;
if (mode === 'equity_only' && c.type === 'call') typeText = 'call as long';
if (mode === 'equity_only' && c.type === 'put') typeText = 'put as short';
html += '<td><span class="tag ' + cls + '">' + typeText + '</span></td>';
if (showOptionDetails) {
if (c.strike) {
html += '<td class="r">$' + c.strike.toFixed(0) + '</td>';
html += '<td class="r">' + (c.expiry || '\u2014') + '</td>';
} else {
html += '<td class="r">\u2014</td><td class="r">\u2014</td>';
}
}
html += '<td class="r">' + (c.excluded ? '0.0' : (c.adjWeight * 100).toFixed(1)) + '%%</td>';
html += '<td class="r">' + (c.excluded ? '$0.00' : (c.target == null ? 'N/A' : '$' + c.target.toFixed(2))) + '</td>';
var priceText = c.price != null ? '$' + c.price.toFixed(2) : 'N/A';
if (showOptionDetails && c.instrument === 'option' && c.priceAsOf) {
priceText += '<br><span class="muted">' + c.priceAsOf + '</span>';
}
html += '<td class="r">' + priceText + '</td>';
html += '<td class="r">' + (c.units != null ? c.units.toLocaleString() : 'N/A') + '</td>';
html += '<td class="r">' + (c.cost != null ? '$' + c.cost.toFixed(2) : 'N/A') + '</td>';
html += '</tr>';
});
html += '</tbody></table>';
html += '<div class="summary">';
html += '<div><span>' + (mode === 'full' ? 'Allocated' : 'Gross exposure') + ':</span> $' + allocated.toFixed(2) + '</div>';
html += '<div><span>Unsized capital:</span> $' + unsizedCapital.toFixed(2) + '</div>';
html += '<div><span>Residual:</span> $' + (bankroll - allocated - unsizedCapital).toFixed(2) + '</div>';
html += '</div>';
document.getElementById('output').innerHTML = html;
// Auto-resize iframe to fit content
try {
var el = window.frameElement;
if (el) el.style.height = document.body.scrollHeight + 'px';
} catch(e) {}
}
document.getElementById('bankroll').addEventListener('input', render);
document.getElementById('mode').addEventListener('change', function() { syncCutoff(); render(); });
document.getElementById('cutoff').addEventListener('input', function() { syncCutoff(); render(); });
document.getElementById('output').addEventListener('change', function(e) {
if (e.target.type === 'checkbox' && e.target.dataset.key) {
if (e.target.checked) {
delete excluded[e.target.dataset.key];
} else {
excluded[e.target.dataset.key] = true;
}
render();
}
});
// Dark mode
function isDark() {
try { return parent.document.documentElement.getAttribute('data-theme') === 'dark'; }
catch(e) { return window.matchMedia('(prefers-color-scheme: dark)').matches; }
}
function applyTheme() {
document.body.classList.toggle('dark', isDark());
}
applyTheme();
try {
new MutationObserver(applyTheme).observe(
parent.document.documentElement,
{ attributes: true, attributeFilter: ['data-theme'] });
} catch(e) {}
render();
""" % (json.dumps(eq_data), json.dumps(full_data))
BODY = (
'<div class="controls">\n'
'<label for="bankroll">Bankroll ($)</label>\n'
'<input type="number" id="bankroll" value="10000" min="0" step="100">\n'
'<label for="mode">Mode</label>\n'
'<select id="mode">\n'
'<option value="equity_only" selected>Equity proxy</option>\n'
'<option value="full">Option proxy</option>\n'
'</select>\n'
'<label for="cutoff">Cutoff (%%)</label>\n'
'<input type="number" id="cutoff" value="0" min="0" max="100" '
' step="0.5" style="width:80px">\n'
'</div>\n'
'<div id="mode-desc" class="meta" style="font-style:italic"/>\n'
'<div class="meta">\n'
' %s filing (filed %s) &middot; underlying prices as of %s\n'
'</div>\n'
'<div id="output"/>\n'
) % (quarter, filing_date, today)
html = (
'<!DOCTYPE html>\n<html>\n<head>\n'
'<meta charset="utf-8">\n'
'<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate">\n'
'<style>\n' + CSS + '</style>\n'
'</head>\n<body>\n'
+ BODY
+ '<script>\n' + JS + '\n</script>\n'
'</body>\n</html>'
)
outpath = os.path.join(HUGO_BASE, 'static', 'images', 'sa-lp-calculator.html')
with open(outpath, 'w') as f:
f.write(html)
```</div></details><iframe src="/images/sa-lp-calculator.html" width="100%" height="580" style="border:none;" scrolling="no"/>
Based on the five filings to date, the fund files between zero and three days ahead of the 45-day deadline:
| Quarter end | 45-day deadline | Actual filing | Days early |
|-------------|-----------------|---------------|------------|
| 2024-12-31 | Feb 14 | Feb 12 | 2 |
| 2025-03-31 | May 15 | May 14 | 1 |
| 2025-06-30 | Aug 14 | Aug 14 | 0 |
| 2025-09-30 | Nov 14 | Nov 14 | 0 |
| 2025-12-31 | Feb 14 | Feb 11 | 3 |
So expect new disclosures around **February 14**, **May 15**, **August 14**, and **November 14**. If you decide to implement the copycat strategy, consider setting a calendar reminder. I’ll try to keep this note updated, but please let me know if anything looks out of date.
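If you prefer to script the reminder, the deadline arithmetic is simple. Here is a minimal sketch (my own helper, not part of the code above; it ignores the SEC's rule that a deadline falling on a weekend or federal holiday rolls forward to the next business day):

```python
from datetime import date, timedelta

def form_13f_deadline(quarter_end: date) -> date:
    """13F filings are due within 45 days of the calendar quarter end."""
    return quarter_end + timedelta(days=45)

# The four quarter ends of 2025 map to the deadlines in the table above.
for q_end in (date(2025, 3, 31), date(2025, 6, 30),
              date(2025, 9, 30), date(2025, 12, 31)):
    print(q_end.isoformat(), '->', form_13f_deadline(q_end).isoformat())
```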
_With thanks to Bastian Stern and Jonas Vollmer for comments._
[^fn:1]: See [below](/notes/situational-awareness-lp/) for an estimate of the cost of these delays.
[^fn:2]: The code in the blocks that follow was written by Claude Opus 4.6 and audited by GPT-5.4.
[^fn:3]: MarketData’s Starter plan provides the historical chains and quotes needed here, but the tested Starter responses returned null historical Greek fields, so the code uses vendor delta when present and otherwise infers delta from the observed option mid price, underlying price, strike, and expiration. This Black-Scholes inference is used only for contract selection (to approximate the target |delta|≈0.15), never to reprice returns: every reported P&amp;L comes from an observed MarketData quote.
[^fn:4]: The single-switch assumption is a simplification: the fund likely makes multiple trades throughout the quarter. But since we only observe quarter-end snapshots, the uniform single-switch model is the most we can extract from the data. There is also a one-time boundary cost—the return of the initial portfolio during the first ~45 days before the copycat sees the first filing—but this is a startup effect, not a recurring feature of the delay, so we omit it from the analysis.
[^fn:5]: For the most recent transition, where \\(Q\_{i+1}\\) is not yet available, the gap estimate falls back to the simple comparison of \\(Q\_i\\) vs \\(Q\_{i-1}\\). A more accurate approach would be to compute the historical returns using this simple comparison, compare them to the returns as we now calculate them, and then adjust the simple-comparison returns for the most recent transition accordingly. This adjustment would increase the cumulative delay costs somewhat, but probably not by enough to matter: it only affects ~45/135 ≈ 33% of the days in the last quarter.]]></description></item><item><title>My Emacs packages</title><link>https://stafforini.com/notes/my-emacs-packages/</link><pubDate>Sat, 28 Feb 2026 05:00:07 +0000</pubDate><guid>https://stafforini.com/notes/my-emacs-packages/</guid><description>&lt;![CDATA[The following is a list of some Emacs packages I created. All of these packages were originally developed for my own use, and were private until very recently. I decided to make them public in the hope that they might be useful to others.
Besides these packages, I have developed dozens of extensions for a wide variety of packages and features. See my [Emacs config](/notes/my-emacs-config/) for details.
`anki-noter` {#anki-noter}
`anki-noter` uses a large language model (via `gptel`) to generate [Anki](https://apps.ankiweb.net/) flashcards from various source materials—buffers, files (including PDFs), and URLs—and inserts them as org-mode headings formatted for `anki-editor`. A template system lets users switch between prompt strategies for different domains, and a transient menu provides a single entry point for configuring source, target, and generation options.
[Full documentation](/notes/anki-noter/)
`annas-archive` {#annas-archive}
`annas-archive` provides Emacs integration for [Anna's Archive](https://annas-archive.li/), the largest existing search engine for shadow libraries. Given a search query (ISBN, DOI, or free-text title/author), it fetches matching results and can download files programmatically via the fast download API.
[Full documentation](/notes/annas-archive/)
`agent-log` {#agent-log}
`agent-log` lets you browse AI coding agent session logs in Emacs. Agents such as [Claude Code](https://docs.anthropic.com/en/docs/claude-code) and [Codex](https://openai.com/index/introducing-codex/) store complete conversation transcripts as JSONL files; this package renders them as readable Markdown files so that standard tools (`consult-ripgrep`, `dired`, `grep`) work natively on readable content.
[Full documentation](/notes/agent-log/)
`bib` {#bib}
`bib` provides a handful of conveniences for quickly retrieving bibliographic metadata for books, academic papers and films from within Emacs. Given only a title (and optionally an author), the package searches the relevant public APIs, picks the correct unique identifier (ISBN, DOI, IMDb / Letterboxd slug) and returns a ready-to-paste BibTeX entry or URL.
[Full documentation](/notes/bib/)
`codex` {#codex}
`codex` provides an Emacs interface to the [OpenAI Codex CLI](https://github.com/openai/codex), embedding the full agent TUI inside an `eat` or `vterm` terminal buffer. It supports multiple named sessions per project, sending commands with file and line context, a transient menu for all CLI slash commands, and an auto-configured hooks system that relays CLI lifecycle events back to Emacs.
[Full documentation](/notes/codex/)
`elfeed-ai` {#elfeed-ai}
`elfeed-ai` adds AI-powered content curation to [elfeed](https://github.com/skeeto/elfeed), the Emacs feed reader. It uses `gptel` to score each entry against a natural-language interest profile and surfaces only the content that matters to you, with a configurable daily budget to control API costs.
[Full documentation](/notes/elfeed-ai/)
`gdocs` {#gdocs}
`gdocs` provides bidirectional synchronization between org-mode files and [Google Docs](https://docs.google.com/). It lets you open, edit, create, and push Google Docs entirely from within Emacs, using org-mode as the native editing format, with OAuth 2.0 authentication, incremental diffing, and a side-by-side conflict resolution UI.
[Full documentation](/notes/gdocs/)
`gptel-plus` {#gptel-plus}
`gptel-plus` provides a few enhancements for [gptel](https://github.com/karthink/gptel), a package for interfacing with large language models in Emacs, including ex ante cost estimation, ex post cost calculation, context persistence and context file management.
[Full documentation](/notes/gptel-plus/)
`johnson` {#johnson}
`johnson` is a multi-format dictionary UI for Emacs. It provides the functionality of programs such as [GoldenDict](https://en.wikipedia.org/wiki/GoldenDict) and [StarDict](https://en.wikipedia.org/wiki/StarDict), allowing you to look up words across multiple dictionaries simultaneously and view formatted definitions in a dedicated Emacs buffer. The package is implemented entirely in Emacs Lisp, relying on Emacs 30.1's built-in sqlite support for efficient headword indexing.
[Full documentation](/notes/johnson/)
`kelly` {#kelly}
`kelly` is an Emacs Lisp implementation of the [Kelly criterion](https://en.wikipedia.org/wiki/Kelly_criterion), a formula used to determine the optimal size of a series of bets.
[Full documentation](/notes/kelly/)
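As a reminder of what the criterion says: for a bet that pays net odds of b-to-1 and wins with probability p, the optimal fraction of the bankroll to stake is f\* = p − (1 − p)/b. A toy Python rendering of the formula (illustrative only; the package itself is Emacs Lisp and its interface differs):

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly-optimal bet fraction for win probability p and net odds b-to-1.

    A non-positive result means the bet has no edge; stake nothing.
    """
    f = p - (1 - p) / b
    return max(f, 0.0)

# A 60% chance to win at even odds: stake about 20% of the bankroll.
print(kelly_fraction(0.6, 1.0))
```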
`mullvad` {#mullvad}
`mullvad` collects a few functions for interfacing with [Mullvad](https://mullvad.net/), a VPN service, allowing you to connect to and disconnect from servers, and to use it programmatically for location-restricted services.
[Full documentation](/notes/mullvad/)
`org-indent-pixel` {#org-indent-pixel}
`org-indent-pixel` provides pixel-accurate `wrap-prefix` for variable-pitch Org buffers, fixing the progressive misalignment of continuation lines that occurs when `org-indent-mode` and `buffer-face-mode` are both active.
[Full documentation](/notes/org-indent-pixel/)
`org-table-wrap` {#org-table-wrap}
`org-table-wrap` provides visual word-wrapping for Org mode tables that overflow the window width. It uses overlays to display a wrapped rendering with Unicode box-drawing characters, following the same reveal-on-enter pattern as `org-appear`.
[Full documentation](/notes/org-table-wrap/)
`pangram` {#pangram}
`pangram` provides an Emacs interface to the Pangram Labs API for detecting AI-generated content in text. Given a buffer or an active region, it sends the text to Pangram's inference endpoint and visually highlights segments classified as AI-generated or AI-assisted using distinct overlay faces.
[Full documentation](/notes/pangram/)
`pdf-tools-pages` {#pdf-tools-pages}
`pdf-tools-pages` is a simple extension for [pdf-tools](https://github.com/vedang/pdf-tools) that supports extracting a selection of pages from a PDF into a new file and deleting a selection of pages from the current PDF.
[Full documentation](/notes/pdf-tools-pages/)
`sgn` {#sgn}
`sgn` is a Signal messenger client for Emacs, forked from [signel](https://github.com/keenban/signel). It communicates with [signal-cli](https://github.com/AsamK/signal-cli) via JSON-RPC and persists all message history in a local SQLite database with FTS5 full-text search.
[Full documentation](/notes/sgn/)
`spofy` {#spofy}
`spofy` is a full-featured [Spotify](https://developer.spotify.com/) client for Emacs. It communicates with the Spotify Web API to provide playback control, search, browsing, playlist management, and library management, with a dashboard buffer, mode-line display, and a `transient` popup for quick access to all commands.
[Full documentation](/notes/spofy/)
`trx` {#trx}
`trx` is a full-featured [Transmission](https://transmissionbt.com/) BitTorrent client for Emacs. It communicates with a Transmission daemon over its JSON RPC protocol, providing torrent management, file selection, peer inspection, tracker manipulation, speed and ratio limits, and alternative speed scheduling ("turtle mode"), all from within `tabulated-list-mode` buffers. A renamed and improved fork of Mark Oteiza's `transmission.el`.
[Full documentation](/notes/trx/)
`wikipedia` {#wikipedia}
`wikipedia` provides a comprehensive Emacs interface for Wikipedia, building on top of `mediawiki.el`. It adds higher-level workflows for editing, reviewing, and managing Wikipedia content, including a grouped watchlist with unread tracking and background diff prefetching, revision history browsing, user inspection, XTools statistics, local draft management, and an offline mirror backed by SQLite.
[Full documentation](/notes/wikipedia/)]]></description></item><item><title>Anki decks from the LessWrong community</title><link>https://stafforini.com/notes/anki-decks-from-the-lesswrong-community/</link><pubDate>Fri, 24 May 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/anki-decks-from-the-lesswrong-community/</guid><description>&lt;![CDATA[In a [recent LessWrong post](/r/discussion/lw/h2m/solved_problems_repository/), Qiaochu Yuan noted that "various mnemonic techniques like memory palaces, along with spaced repetition, seem to more or less solve the problem of memorization." The list below is an attempt to compile all existing Anki decks created by Less Wrong users, in the hope that they will be of help to others in memorizing the corresponding material. (Anki is arguably the [most popular](http://www.gwern.net/Spaced%20repetition#popularity) spaced repetition software.) If you know of a deck not included here, please mention it in the comments section and I'll add it to the list. Thanks!
Please note that I have excluded some of my own Anki decks, which may not be of interest to members of the LessWrong community; all such decks may be found here.
**Update (August 2019)**: The links to many of the decks below died in the intervening years, in part because AnkiWeb deletes shared decks with low download activity. Fortunately, I managed to regenerate almost all of these decks from my own master deck. To prevent further loss of content, I have now uploaded all of the extant decks to my server and added backup links to these archived versions.
Accounting and Finance {#accounting-and-finance}
- [Ivo Welch's _Corporate Finance_ (2nd ed.)](https://ankiweb.net/shared/info/2822835243) (incomplete) [[archived](anki-decks-from-the-lesswrong-community/Ivo_Welchs_Corporate_Finance_2nd_ed.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
- [Mark Piper's _Accounting Made Simple_](https://ankiweb.net/shared/info/3028692523) [[archived](anki-decks-from-the-lesswrong-community/Mark_Pipers_Accounting_Made_Simple.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
AI {#ai}
- [AI policy](https://ankiweb.net/shared/info/1004270498) [[archived](anki-decks-from-the-lesswrong-community/AI_Policy.apkg)], by Roxanne Heston.
- [Nick Bostrom's _Superintelligence_](https://ankiweb.net/shared/info/955711794) [[archived](anki-decks-from-the-lesswrong-community/Nick_Bostroms_Superintelligence.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
Apps and Software {#apps-and-software}
- [Chrome keyboard shortcuts (Windows)](https://ankiweb.net/shared/info/119714158) [[archived](anki-decks-from-the-lesswrong-community/Chrome_Windows_keyboard_shortcuts.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
- Google Docs keyboard shortcuts (Mac), by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/) [not currently available].
- [Mac keyboard shortcuts](https://ankiweb.net/shared/info/290629306) [[archived](<anki-decks-from-the-lesswrong-community/Mac shortcuts.apkg>)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
- [Vimium keyboard shortcuts](https://ankiweb.net/shared/info/1615416392) [[archived](anki-decks-from-the-lesswrong-community/Vimium_keyboard_shortcuts.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo).
Art and Music {#art-and-music}
- 100 Greatest Paintings of All Time [[archived](anki-decks-from-the-lesswrong-community/Paintings.apkg)], by [Risto_Saarelma](http://www.lesswrong.com/user/Risto_Saarelma/). Based on [lukeprog](http://www.lesswrong.com/user/lukeprog/overview/)'s [listology](http://www.listology.com/lukeprog/list/100-greatest-paintings-all-time-pics/), itself based on Piero Scaruffi's [1000 Greatest Western Paintings of All Times](http://www.scaruffi.com/art/greatest.html).
- [Ear Training (chords)](https://ankiweb.net/shared/info/1935085027) [[archived](anki-decks-from-the-lesswrong-community/Ear_training_chords.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/). Contains sound samples of about 40 of the most common chords, in root position.
Communication {#communication}
- [Adele Faber &amp; Elaine Mazlish's _How to Talk So Kids Will Listen and Listen So Kids Will Talk_](http://becomingeden.com/wp-content/uploads/2013/09/How-to-Talk-So-Kids-Will-Listen.apkg) [[archived](anki-decks-from-the-lesswrong-community/How-to-Talk-So-Kids-Will-Listen.apkg)], by [divia](http://www.lesswrong.com/user/divia).
- [Kerry Patterson, Joseph Grenny, Ron McMillan &amp; Al Switzle's _Crucial Conversations_](http://becomingeden.com/wp-content/uploads/2013/09/Crucial-Conversations.apkg) [[archived](anki-decks-from-the-lesswrong-community/Crucial-Conversations.apkg)], by [divia](http://www.lesswrong.com/user/divia).
- [Marshall Rosenberg's _Nonviolent Communication_](http://becomingeden.com/wp-content/uploads/2013/09/Nonviolent-Communication.apkg) [[archived](anki-decks-from-the-lesswrong-community/Nonviolent-Communication.apkg)], by [divia](http://www.lesswrong.com/user/divia).
Dreaming and Psychedelia {#dreaming-and-psychedelia}
- [MAPS's Responding to Difficult Psychedelic Experiences](http://becomingeden.com/wp-content/uploads/2013/09/Responding-to-Difficult-Psychedelic-Experiences.apkg) [[archived](anki-decks-from-the-lesswrong-community/Responding-to-Difficult-Psychedelic-Experiences.apkg)], by [divia](http://www.lesswrong.com/user/divia).
- [Stephen LaBerge &amp; Howard Rheingold's _Exploring the World of Lucid Dreaming_](http://becomingeden.com/wp-content/uploads/2013/09/Exploring-the-World-of-Lucid-Dreaming.apkg) [[archived](anki-decks-from-the-lesswrong-community/Exploring-the-World-of-Lucid-Dreaming.apkg)], by [divia](http://www.lesswrong.com/user/divia).
Languages {#languages}
- [German cases](https://ankiweb.net/shared/info/1053501313) [[archived](anki-decks-from-the-lesswrong-community/German_cases.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
- [La Rochefoucauld's _Maxims_](https://ankiweb.net/shared/info/6517124) [[archived](anki-decks-from-the-lesswrong-community/La_Rochefoucaulds_Maxims_French-English.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
- [Wittgenstein's _Tractatus_](https://ankiweb.net/shared/info/1429345215) [[archived](anki-decks-from-the-lesswrong-community/Wittgensteins_Tractatus_German-English.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
Miscellaneous {#miscellaneous}
- [10 Rules for Dealing with the Police](http://becomingeden.com/wp-content/uploads/2013/09/10-Rules-for-Dealing-with-the-Police.apkg) [[archived](anki-decks-from-the-lesswrong-community/10-Rules-for-Dealing-with-the-Police.apkg)], by [divia](http://www.lesswrong.com/user/divia).
Mnemonics {#mnemonics}
- [How to Formulate Knowledge](http://alexvermeer.com/download/How-to-Formulate-Knowledge.anki) [[archived](anki-decks-from-the-lesswrong-community/How-to-Formulate-Knowledge.anki)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/). Based on Piotr Wozniak's [20 Rules of Formulating Knowledge](http://www.supermemo.com/articles/20rules.htm). [See also [divia's similar deck](http://becomingeden.com/wp-content/uploads/2013/09/20-Rules-of-Formatting-Knowledge.apkg)]
- [The Major Mnemonic Memory System](http://alexvermeer.com/download/Major-Mnemonic-Memory-System.anki) [[archived](anki-decks-from-the-lesswrong-community/Major-Mnemonic-Memory-System.anki)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/). Contains cards for the sounds associated with 0 through 9 as well as 100 pegs.
Philosophy {#philosophy}
- [David Chalmers’s _Constructing the World_](https://ankiweb.net/shared/info/3916604735) [[archived](anki-decks-from-the-lesswrong-community/David_Chalmers_-_Constructing_the_World.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
Psychology and Psychiatry {#psychology-and-psychiatry}
- [List of personality disorders](https://ankiweb.net/shared/info/2734101689) [[archived](anki-decks-from-the-lesswrong-community/List_of_Personality_Disorders.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/). From Theodore Millon's _Personality Disorders in Modern Life_.
- [Peter Gray's Psychology (6th ed.)](https://ankiweb.net/shared/info/872250656) (incomplete) [[archived](anki-decks-from-the-lesswrong-community/Peter_Grays_Psychology_6th_ed.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
Rationality and Cognitive Science {#rationality-and-cognitive-science}
- [List of Cognitive Biases and Fallacies](https://ankiweb.net/shared/info/970971960) [[archived](anki-decks-from-the-lesswrong-community/List_of_Cognitive_Biases_and_Fallacies.apkg)], by [phob](http://www.lesswrong.com/user/phob/overview/). Based on Wikipedia's [List of cognitive biases](http://en.wikipedia.org/wiki/List_of_cognitive_biases) and [List of fallacies](http://en.wikipedia.org/wiki/List_of_fallacies).
- Rationality Habits Checklist [[archived](<anki-decks-from-the-lesswrong-community/Rationality habits checklist.apkg>)], by [Qiaochu_Yuan](http://www.lesswrong.com/user/Qiaochu_Yuan/). Based on [this post](/lw/fc3/checklist_of_rationality_habits/).
Self-Help {#self-help}
- [Carol Dweck's _Mindset: The New Psychology of Success_](http://alexvermeer.com/download/Mindset.anki) [[archived](anki-decks-from-the-lesswrong-community/Mindset.anki)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/).
- [David Burns's _The Feeling Good Handbook_](https://ankiweb.net/shared/info/1806513892) [[archived](anki-decks-from-the-lesswrong-community/David_Burnss_The_Feeling_Good_Handbook.apkg)], by [Pablo](http://www.lesswrong.com/user/Pablo_Stafforini/).
- [Get Motivated](http://alexvermeer.com/download/Get-Motivated.apkg) [[archived](anki-decks-from-the-lesswrong-community/Get-Motivated.apkg)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/). An experimental deck for getting yourself motivated using the advice from Piers Steel's _The Procrastination Equation_ and the author's own [How to Get Motivated](http://alexvermeer.com/getmotivated/) poster.
- Richard Wiseman's _59 Seconds_ [[archived](<anki-decks-from-the-lesswrong-community/59 Seconds.apkg>)], by [Dorikka](http://www.lesswrong.com/user/Dorikka/).
- [Tim Ferriss's _The Four Hour Work Week_](http://alexvermeer.com/download/The-Four-Hour-Work-Week.anki) [[archived](anki-decks-from-the-lesswrong-community/The-Four-Hour-Work-Week.anki)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/).
Sequences and Related LW Material {#sequences-and-related-lw-material}
- [A Human's Guide to Words](http://becomingeden.com/wp-content/uploads/2013/09/A-Humans-Guide-to-Words.apkg) [[archived](anki-decks-from-the-lesswrong-community/A-Humans-Guide-to-Words.apkg)], by [divia](http://www.lesswrong.com/user/divia).
- [Eliezer Yudkowsky's "Twelve Virtues of Rationality"](http://alexvermeer.com/download/Twelve+Virtues+of+Rationality.anki) [[archived](<anki-decks-from-the-lesswrong-community/Twelve Virtues of Rationality.apkg>)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/).
- [LessWrong wiki](https://www.lesswrong.com/posts/Xd8aQsZroPYN4CZXM/lesswrong-wiki-as-anki-deck) [[archived](<anki-decks-from-the-lesswrong-community/LessWrong Wiki.apkg>)], by [mapnoterritory](https://www.lesswrong.com/users/mapnoterritory).
- [Mysterious Answers to Mysterious Questions](http://becomingeden.com/wp-content/uploads/2013/09/Mysterious-Answers-to-Mysterious-Questions.apkg) [[archived](anki-decks-from-the-lesswrong-community/Mysterious-Answers-to-Mysterious-Questions.apkg)], by [divia](http://www.lesswrong.com/user/divia).
Statistics {#statistics}
- [Quick Bayes Table](http://alexvermeer.com/download/Quick-Bayes-Table.anki) [[archived](anki-decks-from-the-lesswrong-community/Quick-Bayes-Table.apkg)], by [alexvermeer](http://www.lesswrong.com/user/alexvermeer/). A simple deck of cards for internalizing conversions between percent, odds, and decibels of evidence.
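The deck above drills conversions between percent, odds, and decibels of evidence, where decibels are defined as 10·log₁₀ of the odds. As a rough sketch of the underlying arithmetic (the function names are mine, not from the deck):

```python
import math

def percent_to_odds(p):
    """Convert a probability in percent (e.g. 75) to odds (e.g. 3.0, i.e. 3:1)."""
    prob = p / 100.0
    return prob / (1.0 - prob)

def odds_to_decibels(odds):
    """Decibels of evidence: 10 * log10(odds)."""
    return 10.0 * math.log10(odds)

def percent_to_decibels(p):
    return odds_to_decibels(percent_to_odds(p))

# 50% is even odds (1:1), i.e. 0 dB; 10:1 odds is exactly 10 dB.
print(round(percent_to_decibels(50), 2))  # 0.0
print(round(percent_to_odds(75), 2))      # 3.0
```

So, for instance, 75% probability corresponds to 3:1 odds, or about 4.8 decibels of evidence.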
_With thanks to Lorenzo Buonanno._]]></description></item><item><title>Derek Parfit: a bibliography</title><link>https://stafforini.com/notes/derek-parfit-a-bibliography/</link><pubDate>Sat, 25 May 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/derek-parfit-a-bibliography/</guid><description>&lt;![CDATA[_![](/ox-hugo/derek-parfit-portrait.jpg)_
What interests me most are the metaphysical questions whose answers can affect our emotions, and have rational and moral significance. Why does the Universe exist? What makes us the same person throughout our lives? Do we have free will? Is time's passage an illusion?
---
- {{< cite "Parfit2021ImprovingScanlonS" >}}
- {{< cite "Parfit2017WhatMattersVolume" >}}
- {{< cite "Parfit2017FuturePeopleNonidentity" >}}
- {{< cite "Parfit2017Responses" >}}
- {{< cite "Parfit2016ConflictingReasons" >}}
- {{< cite "Parfit2016CanWeAvoid" >}}
- {{< cite "Parfit2016PersonalOmnipersonalDuties" >}}
- {{< cite "Parfit2012AnotherDefencePriority" >}}
- {{< cite "Parfit2012WeAreNot" >}}
- {{< cite "Parfit2011WhatMattersVolumeb" >}}
- {{< cite "Parfit2011WhatMattersVolume" >}}
- {{< cite "Parfit2007PersonsBodiesHuman" >}}
- {{< cite "Parfit2007PersonalIdentityWhat" >}}
- {{< cite "Parfit2006KantArgumentsHis" >}}
- {{< cite "Parfit2006Normativity" >}}
- {{< cite "Parfit2004Postcript" >}}
- {{< cite "Parfit2003JustifiabilityEachPerson" >}}
- {{< cite "Parfit2004WhatWeCould" >}}
- {{< cite "Parfit2001RationalityReasons" >}}
- {{< cite "Parfit2001BombsCoconutsRational" >}}
- {{< cite "Parfit1999ExperiencesSubjectsConceptual" >}}
- {{< cite "Parfit1998WhyAnythingWhy2" >}}
- {{< cite "Parfit1998WhyAnythingWhy" >}}
- {{< cite "Parfit1997EqualityPriority" >}}
- {{< cite "Parfit1997ReasonsMotivation" >}}
- {{< cite "Parfit1996ActsOutcomesReply" >}}
- {{< cite "Parfit1995UnimportanceIdentity" >}}
- {{< cite "Parfit1995InterviewDerekParfit" >}}
- {{< cite "Parfit1993IndeterminacyIdentityReply" >}}
- {{< cite "Parfit1993PaulSeabrightPluralism" >}}
- {{< cite "Parfit1992WhoYouThink" >}}
- {{< cite "Parfit1992PuzzleRealityWhy" >}}
- {{< cite "Cowen1992SocialDiscountRate" >}}
- {{< cite "Parfit1991IsaiahBerlin" >}}
- {{< cite "Parfit1991WhyDoesUniverse" >}}
- {{< cite "Parfit1991EqualityPriority" >}}
- {{< cite "Parfit1988WhatWeTogether" >}}
- {{< cite "Parfit1987Response" >}}
- {{< cite "Parfit1987ReplySterba" >}}
- {{< cite "Parfit1987DividedMindsNature" >}}
- {{< cite "Parfit1986Comments" >}}
- {{< cite "Parfit1986OverpopulationAndQuality" >}}
- {{< cite "Parfit1984RationalityTime" >}}
- {{< cite "Parfit1984ReasonsPersons" >}}
- {{< cite "Parfit1983EnergyPolicyFurther" >}}
- {{< cite "Parfit1983EnergyPolicyFurthera" >}}
- {{< cite "Dennett1982SummaryOfDiscussion" >}}
- {{< cite "Parfit1982PersonalIdentityRationality" >}}
- {{< cite "Parfit1982FutureGenerationsFurther" >}}
- {{< cite "Parfit1981Correspondence" >}}
- {{< cite "Parfit1981AttackSocialDiscount" >}}
- {{< cite "Parfit1979CommonsenseMoralitySelfdefeating" >}}
- {{< cite "Parfit1979Correspondence" >}}
- {{< cite "Parfit1979PrudenceMoralityPrisoner" >}}
- {{< cite "Parfit1978InnumerateEthics" >}}
- {{< cite "Parfit1976LewisPerryWhat" >}}
- {{< cite "Parfit1976DoingBestOur" >}}
- {{< cite "Parfit1976RightsInterestsPossible" >}}
- {{< cite "Parfit1973LaterSelvesMoral" >}}
- {{< cite "Parfit1971ImportanceSelfidentity" >}}
- {{< cite "Parfit1971PersonalIdentity" >}}
- {{< cite "Parfit1964EtonCollegeChronicle" >}}
- {{< cite "Parfit1964Fish" >}}
- {{< cite "Cheetham1964EtonMicrocosm" >}}
- {{< cite "Parfit1963LikePebbles" >}}
- {{< cite "Parfit1962PhotographOfComtesse" >}}
_With thanks to David Edmonds, Johan Gustafsson, and Matthew van der Merwe._
---
What now matters most is how we respond to various risks to the survival of humanity. We are creating some of these risks, and we are discovering how we could respond to these and other risks. If we reduce these risks, and humanity survives the next few centuries, our descendants or successors could end these risks by spreading through this galaxy.
Life can be wonderful as well as terrible, and we shall increasingly have the power to make life good. Since human history may be only just beginning, we can expect that future humans, or supra-humans, may achieve some great goods that we cannot now even imagine. In Nietzsche's words, there has never been such a new dawn and clear horizon, and such an open sea.
{{< figure src="/ox-hugo/venice-lagoon-storm.jpg" alt="Venice lagoon under storm clouds, photographed by Derek Parfit" >}}]]></description></item><item><title>Summary of ‘Time management’, by Randy Pausch</title><link>https://stafforini.com/notes/summary-of-time-management-by-randy-pausch/</link><pubDate>Thu, 08 Aug 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/summary-of-time-management-by-randy-pausch/</guid><description>&lt;![CDATA[{{< figure src="/ox-hugo/randy-pausch.jpg" alt="Randy Pausch" >}}
Randy Pausch's [lecture on time management](http://youtu.be/oTugjssqOT0) is, in my opinion, the best presentation on productivity techniques ever recorded. I have watched the talk at least half a dozen times, and on each occasion I learned something new and important. The summary below leaves out the jokes and engaging stories, focusing exclusively on the actionable bits of advice.
- The talk addresses the following topics:
- How to set goals.
- How to avoid wasting time.
- How to deal with your boss.
- How to delegate.
- How to handle stress and procrastination.
- Americans are very bad at dealing with time.  By contrast, they are very good at dealing with money.
- But time and money are very similar.  A key question to ask is, “How much is an hour of your time worth?” Knowing this figure is very helpful for making decisions involving trade-offs, such as whether you should do something yourself or pay someone else to do it instead. _Think about time and money as if they were almost the same thing_.
- So time, like money, needs to be managed.
- The talk borrows heavily from the following books:
- Cathy Collins, [_Time Management for Teachers_](http://www.amazon.com/Time-Management-Teachers-Techniques-Skills/dp/0139217010)
- Kenneth Blanchard &amp; Spencer Johnson, _The One Minute Manager_
- Stephen Covey, _The 7 Habits of Highly Effective People_
- Dick Lohr, [_Taking Control of Your Workday_](<summary-of-time-management-by-randy-pausch/Lohr - Taking control of your workday.zip>)
- The problem of “time famine” is _systemic_, just as the problem of African famine is.  As such, it requires long-term interventions that target underlying fundamental processes.
- Time management is ultimately about living a more enriching, fulfilling life.  It's about having more fun.
- Being successful doesn't make you manage your time well. Managing your time well makes you successful.  _Someone who is less skilled could still be more successful by developing the relevant metaskills_: the skills to optimize whatever skills you do have.
- Every time you are about to spend time doing something, ask yourself:
- Why am I doing this? What is the goal?
- Why will I succeed?
- What happens if I choose not to do it?
- Don't focus on doing things right.  Focus instead on _doing the right things_.
- Keep a list of the things you want to accomplish, and whenever you catch yourself not doing something that will get you closer to one of those goals, ask yourself why you are doing it.
- 80% of your value results from 20% of your input, so identify that critical 20%, work hard at it, and ignore the rest.
- Planning is critical, and must be done at multiple levels: daily, weekly, monthly and yearly.
- Yes, you will have to change the plan, but you can't change your plan unless you have one.  And having a plan that is subject to change is much better than having no plan at all.
- Keys to having a working to-do list:
- Break down projects into small tasks.
- Do the ugliest thing first.
- _Tackle important, non-urgent tasks before you tackle unimportant, urgent ones._
- It's crucial to keep your desk clear, since it's then much easier to process anything that lands on it.
- Touch each piece of paper only once. Apply this same principle to email.
- A filing system is absolutely essential. Have a single designated place where all papers are stored.
- Use multiple monitors. The cost is trivial.
- Have a calendar.  Even if you can keep commitments in your mind, you'd be using up scarce brain space.
- Rules for using the telephone:
- Always stand when talking on the phone.  This will motivate you to keep your calls short.
- Start your calls by announcing your goals. “Sue, this is Randy. I'm calling you because I have three things I want to get done.”
- Have something on your desk that you are interested in doing next, so that you are not tempted to talk for longer than necessary.
- Call people just before lunchtime.  They'll be eager to eat, and as a result they will keep the conversation short.
- Things to have on your desk
- Speakerphone. You'll be able to do other stuff while waiting on the phone.
- Headset.  You'll be able to use the phone while doing other stuff (e.g. exercising).
- Address stamper.
- Box of Kleenex
- Stack of thank-you cards.
- Thank-you notes are very important: they are a tangible way of telling people how much you appreciate them, and they are so rarely used that people will remember you.
- Recycling bin.  Use it for papers only.  Since it will take weeks to fill up, you can recover papers recently thrown out by mistake.
- Notepad.
- Post-it notes.
- Alternative systems may work for you.  But you do need to think about what does work for you.
- Make your office comfortable for you, but _optionally_ comfortable for others. E.g., have foldable chairs, which you can unfold only for guests whom you must meet for sufficiently long periods.
- Consider the _opportunity cost_ of doing things.  Every time you do something unimportant, you are _not_ doing something important instead.
- Learn to say No.  A useful formula: “I'll do it if nobody else steps forward.”
- Find your creative time and defend it ruthlessly. Match your energy levels to the effort different tasks require.
- Minimize the frequency and length of interruptions.  Each interruption takes about 12 minutes of your time on average.
- Turn off email notifications.
- Say “I'm in the middle of something right now” or “I only have five minutes”.  If you want, you can extend that time later.
- If someone just won't leave, walk to the door, compliment them, thank them, and shake their hand.
- Keep a time journal. Don't wait until the end to complete it; update it regularly throughout the day.
- A time journal gives you valuable information about how you spend your time, allowing you to identify tasks that
- you can delegate to somebody else
- you can do more efficiently
- are particularly important or unimportant
- If you have a gap between two appointments, create a “fake appointment” and spend that time productively.
- Be effective, not just efficient. What matters is the overall outcome.
- Doing things at the last minute is really expensive.
- If you have something that isn't due for a long time, make up a fake deadline and act like it's real.
- Identify the underlying psychological reason why you are procrastinating about something.
- Fear of embarrassment.
- Fear of failure.
- Anxiety about asking someone for something.
- How to delegate:
- You grant authority with responsibility.
- Do the ugliest job yourself.
- Treat your people well.
- Be specific
- A specific task
- A specific time
- A specific penalty or reward
- Challenge your people
- Have a written record
- Make it clear which tasks are the most important
- How to deal with others:
- Reinforce behavior that you want repeated: praise and thank people.
- If you don't want things to be delegated back to you, don't learn how to do them!
- Meetings:
- People should be fully present
- They shouldn't last more than an hour
- There should always be an agenda
- Keep one-minute minutes.
- How to deal with email
- Don't delete past messages.
- Don't send requests to a group of people; email people individually.
- If people don't respond within 48 hours, it's okay to nag them.
- If you have a boss,
- write things down
- ask them
- when is your next meeting with them
- what things they want to be done by when
- whom you can turn to for help
- remember that your boss wants a result, not an excuse
- General advice on vacations:
- Callers should get two options
- “I'm not at the office, but contact x”
- “Call back when I'm back”
- It's not a vacation if you are reading email
- General advice:
- Kill your television.
- Turn money into time.
- E.g., pay someone to mow your lawn.
- Above all else, make sure you eat, sleep and exercise enough.
- Never break a promise, but renegotiate it if need be.
- Recognize that most things are pass/fail.
- Get feedback.
**_Time is all we have, and one day you may find that you have less than you think_.**]]></description></item><item><title>Carlos Santiago Nino: a bibliography</title><link>https://stafforini.com/notes/carlos-santiago-nino-a-bibliography/</link><pubDate>Sun, 08 Feb 2015 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/carlos-santiago-nino-a-bibliography/</guid><description>&lt;![CDATA[{{< figure src="/ox-hugo/carlos-santiago-nino-portrait.jpg" alt="Carlos Santiago Nino" >}}
Carlos Nino was a publicly engaged intellectual of rare integrity and brilliance. In his dedication to human rights, the rule of law, and constitutional legitimacy he combined passion with wisdom and analytic clarity. His inexhaustible courage in fighting to restore decency to his nation provides a model for others working in the wake of dictatorship. We are fortunate to have in his writings a record of his remarkable thought and experience.
Thomas Nagel
I started compiling this bibliography back when I was an undergraduate student at the University of Buenos Aires, and continued to work on it intermittently over the following few years. In 2007, my hard drive was damaged in an accident and most of the data stored on it was lost. Since then I had assumed that the document containing the bibliography was among the affected files. A week ago, however, I stumbled upon a copy of it. Thinking that there might be sufficient interest in this information among legal scholars and other academics, I spent a few hours over the following days updating the references and formatting the bibliography for online publication. I would like to thank the staff at the various institutions whose libraries I consulted in the course of preparing this document, in particular Universidad de San Andrés, Universidad Torcuato Di Tella, Universidad de Buenos Aires, Sociedad Argentina de Análisis Filosófico, Centro de Investigaciones Filosóficas, the University of Oxford (Bodleian Law Library), Balliol College, and the University of Toronto (Robarts Library).
---
1966 {#1966}
1. {{< cite "Nino1966EfectosIlicitoCivil" >}}. Reprinted in {{< cite "Nino2007EfectosIlicitoCivil" >}}.
2. {{< cite "Nino1966ReviewNorbertoEduardo" >}}. Reprinted in {{< cite "Nino2008ReviewNorbertoEduardo" >}}.
1967 {#1967}
1. {{< cite "Nino1967LesionesRetoricaProblema" >}}. Reprinted in {{< cite "Nino2008LesionesRetorica" >}}.
2. {{< cite "Bacque1967TemaInterpretacionLey" >}}. Reprinted in {{< cite "Nino2007InterpretacionLeyRoss" >}}.
1968 {#1968}
1. {{< cite "Nino1968ReviewRichardRobinson" >}}.
1969 {#1969}
1. {{< cite "Nino1969DefinicionDelito" >}}. Reprinted in {{< cite "Nino2008DefinicionDelito" >}}.
1970 {#1970}
1. {{< cite "Nino1970DogmaticaJuridicaSus" >}}. Published with corrections as {{< cite "Nino1984ConsideracionesSobreDogmatica" >}}.
2. {{< cite "Nino2007RacionalIrracionalDogmatica" >}}.
1972 {#1972}
1. {{< cite "Nino1972ConcursoDerechoPenal" >}}.
2. {{< cite "Nino1972PequenaHistoriaDolo" >}}. Based on a 1970 lecture; reprinted in {{< cite "Nino2008PequenaHistoriaDolo" >}}.
1973 {#1973}
1. {{< cite "Nino1973DefinicionDerechoNorma" >}}.
2. {{< cite "Nino1973ConceptosBasicosDerecho" >}}.
1974 {#1974}
1. {{< cite "Nino1974ConceptoSistemaJuridico" >}}.
2. {{< cite "Nino1984ConsideracionesSobreDogmatica" >}}.
3. {{< cite "Nino1974ConceptoAccionDerecho" >}}.
1975 {#1975}
1. {{< cite "Nino1975CienciaDerechoInterpretacion" >}}.
1976 {#1976}
1. {{< cite "Nino1976ConceptoValidezProblema" >}}. Reprinted in {{< cite "Nino2007ConceptoValidezProblema" >}} and, with corrections, as {{< cite "Nino1985ConceptoValidezKelsenAplicado" >}}.
1977 {#1977}
1. {{< cite "Nino1977GeneralStrategyCriminal" >}}. Published in Spanish, with corrections, as {{< cite "Nino1980LimitesResponsabilidadPenal" >}}.
1978 {#1978}
1. {{< cite "Nino1978ConcepcionesFundamentalesLiberalismo" >}}. Reprinted in {{< cite "Nino2007ConcepcionesFundamentalesLiberalismo" >}}.
2. {{< cite "Nino1978ConfusionsKelsenConcept" >}}. Reprinted, in abridged form, as {{< cite "Nino1998ConfusionsKelsenAbridged" >}}. Reprinted in Spanish, with corrections, as {{< cite "Nino1985ConceptoValidezKelsen" >}}.
3. {{< cite "Nino2008SobreQueNos" >}}.
1979 {#1979}
1. {{< cite "Nino1979AlgunosModelosMetodologicos" >}}.
2. {{< cite "Nino1979MismoOmitirQue" >}}. Reprinted in {{< cite "Nino1985MismoOmitirGracia" >}}, and in {{< cite "Nino2008MismoOmitirQue" >}}.
3. {{< cite "Nino1979EsTenenciaDrogas" >}}. Reprinted in {{< cite "Nino2000TenenciaDrogasDeGreiff" >}}.
4. {{< cite "Nino1979FundamentacionLegitimaDefensa" >}}. Reprinted in {{< cite "Nino2008FundamentacionLegitimaDefensa" >}}.
1980 {#1980}
1. {{< cite "Nino1980IntroduccionAnalisisDerecho" >}}. Italian translation: {{< cite "Nino1996IntroduzioneAllAnalisi" >}}. Excerpt reprinted as {{< cite "Nino1995ConceptoResponsabilidad" >}}.
2. {{< cite "Nino1980LimitesResponsabilidadPenal" >}}. Revised Spanish translation of {{< cite "Nino1977GeneralStrategyCriminal" >}}.
3. {{< cite "Nino1980DworkinLegalPositivism" >}}.
4. {{< cite "Nino1980DworkinDisolucionControversia" >}}. Revised Spanish translation of {{< cite "Nino1980DworkinLegalPositivism" >}}. Reprinted in {{< cite "Nino1980DworkinDisolucionRevistaCiencias" >}}. Reprinted with corrections as {{< cite "Nino1985SuperacionControversia" >}}.
5. {{< cite "Nino1980LibreAlbedrioResponsabilidad" >}}. Reprinted in {{< cite "Nino2008LibreAlbedrio" >}}.
6. {{< cite "Nino1980AlfRossMaestro" >}}.
7. Translation of {{< cite "Oppenheim1980LineamientosAnalisisLogico" >}}.
1981 {#1981}
1. {{< cite "Nino1981ConceptosDerecho" >}}. Reprinted with corrections as {{< cite "Nino1985EnfoqueEsencialista" >}}.
2. {{< cite "Nino1981RazonesPrescripcionesRespuesta" >}}. Reprinted with corrections as {{< cite "Nino1985SonPrescripciones" >}}.
3. {{< cite "Nino1981RazonesPrescripcionesRespuesta" >}}. Reprinted in {{< cite "Nino2007RazonesPrescripciones" >}}.
4. {{< cite "Nino1981RespuestaMalamudGoti" >}}. Reprinted in {{< cite "Nino2008RespuestaMalamudGoti" >}}.
5. {{< cite "Nino1981PenaMuerteConsentimiento" >}}. Reprinted in {{< cite "Nino2008PenaMuerteConsentimiento" >}}.
6. {{< cite "Nino1981ReviewStuartHampshire" >}}. Mistakenly reprinted in _Análisis filosófico_, vol. 1, no. 2, pp. 76-77.
7. {{< cite "Nino1981ReviewCarlosAlchourron" >}}.
1982 {#1982}
1. {{< cite "Nino1982LegitimaDefensaFundamentacion" >}}.
2. {{< cite "Nino1982ConcursoContinuacionDelitos" >}}. Reprinted in {{< cite "Nino2008ConcursoContinuacionDelitos" >}}.
1983 {#1983}
1. {{< cite "Bulygin1983LenguajeDerechoHomenaje" >}}.
2. {{< cite "Nino1983ConceptoPoderConstituyente" >}}. Reprinted with corrections as {{< cite "Nino1985CompetenciaConstituyente" >}}.
3. {{< cite "Nino1983CasoConcienciaCuestion" >}}.
4. {{< cite "Nino1983ConsensualTheoryPunishment" >}}. Reprinted in {{< cite "Nino1993ConsensualTheoryDuff" >}}, and in {{< cite "Nino1995ConsensualTheorySimmons" >}}. Spanish translation: {{< cite "Nino2008TeoriaConsensualPena" >}}.
5. {{< cite "Nino1983NuevaEstrategiaPara" >}}. Reprinted with corrections as {{< cite "Nino1985ValidezNormasDeFacto" >}}.
6. {{< cite "Nino1983ConcepcionAlfRoss" >}}. Reprinted in {{< cite "Nino2007ConcepcionAlfRoss" >}}.
7. {{< cite "Nino1983LegalEthicsMetaphysics" >}}. Spanish translation: {{< cite "Nino2007EticaLegalEntre" >}}.
8. {{< cite "Nino1985ResponsabilidadJuridicaRepresion" >}}. Published in {{< cite "Verbitsky1987CivilesMilitares" "pp. 389-391" >}}.
9. {{< cite "Nino1983ReviewJulioMaier" >}}.
10. {{< cite "Nino1983LeyAmnistia" >}}.
11. {{< cite "Nino1966Prologo" >}}.
1984 {#1984}
1. {{< cite "Nino1984EticaDerechosHumanos" >}}.
2. {{< cite "Nino1984LegalNormsReasons" >}}. Reprinted in Spanish, with corrections, as {{< cite "Nino1985NormasJuridicasRazones" >}}.
3. {{< cite "Nino1984ProyectoReformulacionObediencia" >}}. Published in {{< cite "Verbitsky1987CivilesMilitares" "pp. 399-407" >}}.
4. {{< cite "Nino1984LibertyEqualityCausality" >}}. Spanish translation: {{< cite "Nino2007LibertadIgualdadCausalidad" >}}.
5. {{< cite "Nino1984LimitsEnforcementMorality" >}}. Spanish translation: {{< cite "Nino2008LimitesAplicacionMoral" >}}.
6. {{< cite "Nino1984RossReformaProcedimiento" >}}. Reprinted, with corrections, as {{< cite "Nino1985PuedeSistemaJuridico" >}}.
7. {{< cite "Nino1984ProblemasAbiertosFilosofia" >}}.
8. {{< cite "Nino1984ReformaEstudiosAbogacia" >}}.
1985 {#1985}
1. {{< cite "Nino1985ValidezDerecho" >}}.
2. {{< cite "Nino1985LimitacionesTeoriaHart" >}}. Reprinted in {{< cite "Nino2007LimitacionesTeoriaHart" >}}.
3. {{< cite "Nino1985HumanRightsPolicy" >}}.
4. {{< cite "Nino1985HombreSusDerechos" >}}. Reprinted with corrections in {{< cite "Nino1989HombreSusDerechos" >}}.
5. {{< cite "Nino1985QueEsDemocracia" >}}. Reprinted in {{< cite "Nino2007QueEsDemocracia" >}}.
1986 {#1986}
1. {{< cite "Nino1986ReformaConstitucionalDictamen" >}}.
2. {{< cite "Nino1986InformePracticaConstitucional" >}}.
3. {{< cite "Nino1986ExposicionCoordinador" >}}.
4. {{< cite "Nino1986DesafioUniversidadArgentina" >}}.
5. {{< cite "Nino1986DoesConsentOverride" >}}. Spanish translation: {{< cite "Nino2008PuedeConsentimientoAnular" >}}.
6. {{< cite "Nino1986HechosMoralesConcepcion" >}}. Reprinted in {{< cite "Nino1989HechosMorales" >}}.
7. {{< cite "Nino1986RacionalismoFundamentacionEtica" >}}. Reprinted in {{< cite "Nino1989RacionalismoFundamentacion" >}}, and in {{< cite "Nino1992RacionalismoSchuster" >}}.
8. {{< cite "Nino1986ParadojaIrrelevanciaMoral" >}}. Reprinted in {{< cite "Nino1989ParadojaIrrelevancia" >}} and in {{< cite "Nino1990ParadojaVigo" >}}. Italian translation: {{< cite "Nino1990ParadojaIrrilevanzaMartino" >}}.
9. {{< cite "Nino1986JustificacionDemocraciaEntre" >}}. Reprinted in {{< cite "Nino2007JustificacionDemocracia" >}}. Italian translation: {{< cite "Nino1990GiustificazioneMartino" >}}.
10. {{< cite "Nino1986ParticipacionRemedioCrisis" >}}.
11. {{< cite "Nino1986ConceptoDerechoHart" >}}. Revised version of {{< cite "Nino1985LimitacionesTeoriaHart" >}}. Reprinted in {{< cite "Nino2007ConceptoDerechoHart" >}}.
12. {{< cite "Nino1986DemocraciaVerdadMoral" >}}. Reprinted in {{< cite "Nino2007DemocraciaVerdadMoral" >}}.
13. {{< cite "Nino1986DictamenReformaConstitucional" >}}.
14. {{< cite "Nino1986EscepticismoEticoJustificacion" >}}. Reprinted in {{< cite "Nino2007EscepticismoEtico" >}}.
1987 {#1987}
1. {{< cite "Nino1987IntroduccionFilosofiaAccion" >}}.
2. {{< cite "Nino1987ReformaConstitucionalSegundo" >}}.
3. {{< cite "Nino1987VotoObligatorio" >}}.
4. {{< cite "Nino1987SpeedyTrialsArgentina" >}}.
5. {{< cite "Nino1987ConceptMoralPerson" >}}. Spanish translation: {{< cite "Nino1987ConceptoPersonaValdivia" >}}. Another Spanish translation: {{< cite "Nino2007ConceptoPersonaMoral" >}}.
6. {{< cite "Nino1987BegriffRechtfertigung" >}}.
7. {{< cite "Nino1987CuatrilemaConsecuencialismo" >}}. Reprinted in {{< cite "Nino2007CuatrilemaConsecuencialismo" >}}.
8. {{< cite "Nino1987PrologoNudelman" >}}.
9. {{< cite "Nino1987DerechoConstitucionalFrente" >}}.
10. {{< cite "Nino1987DefensaDemocraciaConvergencia" >}}.
11. {{< cite "Nino1987SistemasElectorales" >}}.
1988 {#1988}
1. {{< cite "Nino1988Radiodifusion" >}}.
2. {{< cite "Nino1988PresidencialismoVsParlamentarismo" >}}.
3. {{< cite "Nino1988PresidencialismoVsParlamentarismoArticulo" >}}.
4. {{< cite "Nino1988LiberalismoComunitarismo" >}}.
5. {{< cite "Nino1988PoliticaDerechosHumanos" >}}.
6. {{< cite "Nino1986ConstructivismoEpistemologicoEntre" >}}. Reprinted in {{< cite "Nino1989ConstructivismoEpistemologico" >}}.
7. {{< cite "Nino1988ManHisRights" >}}.
8. {{< cite "Nino1988RelacionEntreDerecho" >}}. Reprinted in {{< cite "Nino2007RelacionDerechoJusticia" >}}.
1989 {#1989}
1. {{< cite= "Nino1989ConstructivismoEtico"=>}}.
2. {{< cite= "Nino1989EticaDerechosHumanos"=>}}.
3. {{< cite= "Nino1989DerivacionPrincipiosResponsabilidad"=>}}. Reprinted in {{< cite= "Nino2008DerivacionPrincipios"=>}}.
4. {{< cite= "Nino1989Presentacion"=>}}.
5. {{< cite= "Nino1989JusticiaConciencia"=>}}.
6. {{< cite= "Nino1989CommunitarianChallengeLiberal"=>}}. Reprinted in {{< cite= "Nino1992CommunitarianChallengeReprint"=>}}.
7. {{< cite= "Nino1989TransitionDemocracyCorporatism"=>}}.
8. {{< cite= "Nino1989ConsolidatingDemocracy"=>}}.
9. {{< cite= "Nino1989MoralDiscourse"=>}}. Spanish translation: {{< cite= "Nino2007DiscursoMoralDerechos"=>}}.
10. {{< cite= "Nino1989ConcienciaCrisis"=>}}.
11. {{< cite= "Nino1989DemocracyCriminalLaw"=>}}. Spanish translation: {{< cite= "Nino2008DerechoPenalDemocracia"=>}}.
12. {{< cite= "Nino1989FilosofiaControlJudicial"=>}}.
13. {{< cite= "Nino1981RespuestaBayon"=>}}. Reprinted in {{< cite= "Nino2007RespuestaBayon"=>}}.
14. {{< cite= "Nino1989JustificacionDemocraciaObligacion"=>}}. Reprints {{< cite= "Nino1989EticaDerechosHumanos"= "pp.= 225-254"=>}}.
15. {{< cite= "Nino1989IndultosConcienciaMoral"=>}}.
1990 {#1990}
1. {{< cite= "Nino1990AutonomiaNecesidadesBasicas"=>}}. Reprinted in {{< cite= "Nino2007AutonomiaNecesidadesBasicas"=>}}.
2. {{< cite= "Nino1990SobreDerechosMorales"=>}}. Reprinted in {{< cite= "Nino2007SobreDerechosMorales"=>}}.
3. {{< cite= "Nino1990EntrevistaGenaroCarrio"=>}}.
4. {{< cite= "Nino1990ConstitucionComoConvencion"=>}}.
5. {{< cite= "Nino1991LiberalismoConservadorLiberal"=>}}. Reprinted in {{< cite= "Nino1990LiberalismoConservadorSistema"=>}} and in {{< cite= "Nino2007LiberalismoConservador"=>}}.
6. {{< cite= "Nino1990PracticaDerechosFundamentales"=>}}.
7. {{< cite= "Nino1990DiscursoBlandoSobre"=>}}.
8. {{< cite= "Nino1990PresidencialismoJustificacionEstabilidad"=>}}.
1991 {#1991}
1. {{< cite= "Nino1991EthicsHumanRights"=>}}. Revised English translation of {{< cite= "Nino1989EticaDerechosHumanos"=>}}.
2. {{< cite= "Nino1991PresidencialismoEstabilidadDemocratica"=>}}.
3. {{< cite= "Nino1991FundamentosAlcancesControl"=>}}.
4. {{< cite= "Nino1991PresidencialismoJustificacionEstabilidadIncollection"=>}}. Reprints {{< cite= "Nino1990PresidencialismoJustificacionEstabilidad"=>}}?.
5. {{< cite= "Nino1991HuidaFrentePenas"=>}}. Reprinted in {{< cite= "Nino2008HuidaFrentePenas"=>}}.
6. {{< cite= "Nino1991EpistemologicalMoralRelevance"=>}}. Revised English translation of {{< cite= "Nino1986ParadojaIrrelevanciaMoral"=>}}? Spanish translation: {{< cite= "Nino2007RelevanciaMoralEpistemica"=>}}.
7. {{< cite= "Nino1991DutyPunishAbuses"=>}}. Reprinted, with a few truncated paragraphs, in {{< cite= "Nino1995DutyPunishKritz"=>}}.
8. {{< cite= "Nino1991FundamentosControlJudicial"=>}}.
9. {{< cite= "NinoDemocraciaEpistemicaPuesta"=>}}. Reprinted in {{< cite= "Nino2007DemocraciaEpistemicaPuesta"=>}}.
10. {{< cite= "Nino1991ConsideracionesAlternativas"=>}}.
1992 {#1992}
1. {{< cite= "Nino1992FundamentosDerechoConstitucional"=>}}.
2. {{< cite= "Nino1992PaisMargenLey"=>}}.
3. {{< cite= "Nino1992PresidencialismoPuestoAPrueba"=>}}.
4. {{< cite= "Nino1992AutonomiaPersonalInvestigacion"=>}}.
5. {{< cite= "Nino1992Rights"=>}}.
6. {{< cite= "Nino1992DeliberativeDemocracyComplexity"=>}}.
7. {{< cite= "Nino1992HiperpresidencialismoArgentino"=>}}.
8. {{< cite= "Nino1992IntroductionRights"=>}}.
9. {{< cite= "Nino1992QueReformaConstitucional"=>}}.
10. {{< cite= "Nino1992ReplicaMariaInes"=>}}. Reprinted in {{< cite= "Nino2007ReplicaPazos"=>}}.
11. {{< cite= "Nino1993RespuestaMoresoNavarro"=>}}. Reprinted in {{< cite= "Nino2007RespuestaMoresoNavarro"=>}}.
12. {{< cite= "Nino1992PresentacionFundamentos"=>}}.
13. {{< cite= "Nino1992EticaAnaliticaActualidad"=>}}. Reprinted in {{< cite= "Nino2007EticaAnaliticaActualidad"=>}}.
14. {{< cite= "Nino1992ConsecuencialismoDebateEtico"=>}}. Reprinted in {{< cite= "Nino2007ConsecuencialismoDebate"=>}}.
15. {{< cite= "Nino1992FoundationsDeliberativeConception"=>}}.
16. {{< cite= "Nino1992SomeThoughtsAbortion"=>}}. Revised English translation of {{< cite= "Nino2013FundamentosDerechoConstitucional"= "pp.= 236-244,= 252-254"=>}}.
17. {{< cite= "Nino1992AutonomiaConstitucional"=>}}.
1993 {#1993}
1. {{< cite= "Nino1993BreveNotaSulla"=>}}. Spanish translation: {{< cite= "Nino2007BreveNotaSobre"=>}}.
2. {{< cite= "Nino1993PhilosophicalReconstructionJudicial"=>}}. Reprinted in {{< cite= "Nino1994PhilosophicalReconstructionRosenfeld"=>}}. Revised English translation of {{< cite= "Nino1991FundamentosControlJudicial"=>}}?.
3. {{< cite= "Nino1993SocialRights"=>}}. Revised English translation of {{< cite= "Nino2013FundamentosDerechoConstitucional"= "pp.= 398-403"=>}}. Spanish translation: {{< cite= "Nino1993DerechosSocialesDerechoSociedad"=>}}. Reprinted as {{< cite= "Nino2000SobreDerechosSociales"=>}}. Another Spanish translation: {{< cite= "Nino2007SobreDerechosSociales"=>}}.
4. {{< cite= "Nino1993CicloConferencias"=>}}.
5. {{< cite= "Nino1993DerechoMoralPolitica"=>}}. Reprinted in {{< cite= "Nino2007DerechoMoralPoliticaArticulo"=>}}.
6. {{< cite= "Nino1979ContextoSocialRegimen"=>}}.
7. {{< cite= "Nino2007Justicia"=>}}. Reprinted in {{< cite= "Nino1996JusticiaGarzonValdes"=>}} and in {{< cite= "Nino2007JusticiaReprint"=>}}.
8. {{< cite= "Nino1993SeAcaboDebate"=>}}. Reprinted as {{< cite= "Nino2008ReplicaZaffaroni"=>}}.
9. {{< cite= "Nino1993DebateConstitutionalReform"=>}}. German translation: {{< cite= "Nino1995DebatteUberVerfassungsreform"=>}}.
10. {{< cite= "Nino1993ExerciseJudicialReview"=>}}.
11. {{< cite= "Nino1993DifficultiesTransitionProcess"=>}}.
12. {{< cite= "Nino1993PresidencialismoReformaConstitucional"=>}}.
13. {{< cite= "Nino1993DirittoMoralePolitica"=>}}.
14. {{< cite= "Nino1993TransitionDemocracyPresidentialism"=>}}. Revised version of {{< cite= "Nino1989TransitionDemocracyCorporatism"=>}}.
15. {{< cite= "Nino1993RadicalEvilManuscript"=>}}. Published, with revisions by Owen Fiss, as {{< cite= "Nino1996RadicalEvilTrial"=>}}.
16. {{< cite= "Nino1993ConstitutionDeliberativeManuscript"=>}}. Published, with revisions by Owen Fiss, as {{< cite= "Nino1996ConstitutionDeliberativeDemocracy"=>}}.
17. {{< cite= "Nino1993PreocupacionesMetafilosoficas"=>}}.
18. {{< cite= "Nino1994PositivismCommunitarianismHuman"=>}}. Spanish translation: {{< cite= "Nino2007PositivismoComunitarismo"=>}}.
19. {{< cite= "Nino1993WhenJustPunishment"=>}}.
1994 {#1994}
1. {{< cite= "Nino1994DerechoMoralPolitica"=>}}. Italian translation: {{< cite= "Nino1999DirittoComeMorale"=>}}.
2. {{< cite= "Nino1994PositivismCommunitarianismHuman"=>}}. Reprinted in {{< cite= "Nino1996PositivismAlston"=>}}.
3. {{< cite= "Nino1994ReformaMenemistaSigno"=>}}.
1995 {#1995}
1. {{< cite= "Nino1995EticaDerechosHumanos"=>}}. Reprinted in {{< cite= "Nino2007EticaDerechosHumanos"=>}}.
1996 {#1996}
1. {{< cite= "Nino1996RadicalEvilTrial"=>}}. Prints, with corrections, the 1993 manuscript. Spanish translation: {{< cite= "Nino1997JuicioMalAbsoluto"=>}}.
2. {{< cite= "Nino1996ConstitutionDeliberativeDemocracy"=>}}. Prints, with corrections, the 1993 manuscript. Spanish translation: {{< cite= "Nino1997ConstitucionDemocraciaDeliberativa"=>}}.
3. {{< cite= "Nino1996HyperpresidentialismConstitutionalReform"=>}}.
4. {{< cite= "Nino1996KantVersusHegel"=>}}. Reprinted in {{< cite= "Nino1996KantVersusHegelLaPolitica"=>}}.
1998 {#1998}
1. {{< cite= "Nino1998ObjecionConcienciaLibertad"=>}}.
1999 {#1999}
1. {{< cite= "Nino1999SubjetivismoObjetivismoDerecho"=>}}. Reprinted in {{< cite= "Nino2008SubjetivismoObjetivismo"=>}}.
2001 {#2001}
1. {{< cite= "Nino2001Utilitarismo"=>}}. Reprinted in {{< cite= "Nino2007Utilitarismo"=>}}.
2007 {#2007}
1. {{< cite= "Nino2007DerechoMoralPolitica"=>}}.
2. {{< cite= "Nino2007DerechoMoralPoliticaa"=>}}.
2008 {#2008}
1. {{< cite= "Maurino2008FundamentosDeDerechoPenal"=>}}.
2013 {#2013}
1. {{< cite= "Nino2013OchoLeccionesEtica"=>}}.
2. {{< cite= "Nino2013TeoriaJusticiaDemocracia"=>}}.]]></description></item><item><title>Effective altruism syllabi</title><link>https://stafforini.com/notes/effective-altruism-syllabi/</link><pubDate>Wed, 13 Apr 2016 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/effective-altruism-syllabi/</guid><description>&lt;![CDATA[I've been helping some friends create a syllabus for a course on effective altruism, and as part of this effort I compiled a list of existing reading lists on the topic. Am I missing anything?
Syllabi {#syllabi}
- [Luc Bovens &amp; Stephan Chambers](https://www.lse.ac.uk/resources/calendar2019-2020/courseGuides/PH/2019_PH332.htm) (London School of Economics)
- [Tom Bry-Chevalier &amp; Antonin Broi](https://drive.google.com/file/d/11KR_9WqoTO-jvs8jHpTeK-_nbKpBctFc/view) (Paris Sciences et Lettres University)
- [Mark Budolfson](https://www.budolfson.com/teaching/effective-altruism-normative-ethics-and-the-environment) (University of Vermont)
- [Stephen Casper](https://docs.google.com/document/d/1PO9nuEhVW4mDLleR8SkERJ3fIxJMUcYjWlBaVB9YbLE/edit) (Harvard University)
- [Richard Yetter Chappell](https://rychappell.substack.com/p/updated-syllabus-on-ealongtermism) [[earlier course](http://www.philosophyetc.net/2016/04/teaching-effective-altruism.html)] (University of Miami)
- [Izzy Gainsburg](https://docs.google.com/document/d/1hVMTHtJRG_gmscIZDa3aEUE4SAx8jDoQip1EfpB938I/edit) (University of Michigan)
- [Gilad Feldman](https://forum.effectivealtruism.org/posts/Shyfng2xGBHvG5qAK/courses-and-collaborative-books-on-effective-altruism-with) (University of Hong Kong)
- [Hilary Greaves](https://globalprioritiesinstitute.org/michaelmas-term-2017-foundational-issues-in-effective-altruism/) (University of Oxford)
- [Hilary Greaves &amp; Frank Arntzenius](http://users.ox.ac.uk/~mert2255/teaching/grad/EA_HT17/EA2017-complete-set-handouts.pdf) (University of Oxford)
- [Ted Lechterman](/ox-hugo/ETHICSOC-155-Syllabus-3-21-18.pdf) (Stanford University)
- [William MacAskill &amp; Christian Tarsney](https://globalprioritiesinstitute.org/topics-in-global-priorities-research/) (University of Oxford)
- [David Manley](http://davidmanley.squarespace.com/new-page) (University of Michigan, Ann Arbor)
- [Geoffrey Miller](https://forum.effectivealtruism.org/posts/Bd4xeHeNgBywrofW6/psychology-of-effective-altruism-course-syllabus-1) (University of New Mexico)
- [Theron Pummer &amp; Tim Mulgan](http://theronpummer.com/wp-content/uploads/2016/09/EA-module.pdf) (University of St Andrews)
- [Sophie Rose](https://docs.google.com/document/d/16Mi7J3PeUilsVW3W_P6bC4XJFjkDmNMIjlhaJ9MJpOs/edit)
- [Rohin Shah and others](https://disq.us/url?url=https%3A%2F%2Fdrive.google.com%2Fdrive%2Fu%2F0%2Ffolders%2F0B2DBUaKSmI6OZ1hqcW44VVpVejQ%3AAaNsuEz-leyrjzZuYiaYqUjNcEQ&cuid=2312277) [[more details](http://www.stafforini.com/blog/effective-altruism-syllabi/#comment-3209143838)] (University of California, Berkeley)
- [Peter Singer](https://www.coursera.org/learn/altruism) (Coursera)
- [Bonnie Talbert](</ox-hugo/Talbert - The theory and practice of altruism.docx>) (Harvard University)
- [Kerry Vaughan](/ox-hugo/CONTEMPORARY-MORAL-ISSUES-SUMMER-2014-SYLLABUS.docx) (Rice University)
- [Kristine West &amp; Jeff Johnson](/ox-hugo/St.CatherineUniversityEASyllabus.pdf) (St. Catherine University)
Other reading lists {#other-reading-lists}
- [Tyler Alterman](https://medium.com/@tyleralterman/recommended-reading-for-pareto-fellows-38f26179cd3a#.57zbnsnyn)
- [Richard Batty &amp; Oisin Moran](http://effective-altruism.com/ea/5f/effective_altruism_reading_list/)
- [Jakub Kraus](https://docs.google.com/document/d/1Xwhu3HGG2BtjLNpLcbFGLdmZm2kPbDi1-XlBukYf-0o/edit?usp=sharing)
- [Ben Kuhn](http://www.benkuhn.net/ea-reading)
- [Stefan Schubert and others](https://docs.google.com/document/d/1guIbqsX3QUeGwmGfnMfxKhpXr7o12nWpnoUDFSbYk6I/edit)
- [Pablo Stafforini](/notes/bibliography-for-a-course-on-effective-altruism/)
- [Students for High-Impact Charity](http://disq.us/url?url=http%3A%2F%2Fwww.shicschools.org%2Fcurriculum%3Aj2guexkyCEgFFFj7AMt53qSX3Aw&cuid=2312277)
**Update (February 2020)**: Julia Wise has compiled a similar list [here](https://forum.effectivealtruism.org/posts/Y8mBXCKmkS9eBokhG/ea-syllabi-and-teaching-materials).
**Update (July 2021)**: I have created a separate list with [courses on longtermism](/notes/courses-on-longtermism/).
_With thanks to Niel Bowerman, Michael Chen, Pepper, Rohin Shah and Bastian Stern._]]></description></item><item><title>Courses on longtermism</title><link>https://stafforini.com/notes/courses-on-longtermism/</link><pubDate>Sat, 28 Feb 2026 05:00:07 +0000</pubDate><guid>https://stafforini.com/notes/courses-on-longtermism/</guid><description>&lt;![CDATA[Years ago I wrote a [blog post](/notes/effective-altruism-syllabi/) listing all the university courses on effective altruism I was able to find. I have since tried to keep the list updated, as I stumble upon new courses or people draw my attention to them. As a number of courses have recently been offered specifically on [longtermism](https://forum.effectivealtruism.org/tag/longtermism) and related topics, I figured that instead of adding them to the original list, I could create a new one with this more restricted focus.
If you think I'm missing anything, as always, please let me know.
- [Are we doomed?](https://docs.google.com/document/d/1YSs8_COvHQBiE2msm37C2Nt6Sc29GcJJq3fpl2Tii0I/edit?fbclid=IwAR3G2K5CYjkWjKG4-DJ5g_MCr3P3PjsSM9VNucfDKuAU1CVwJc7IGZjx6Yw) (Daniel Holz &amp; James A. Evans, University of Chicago)
- [Catastrophic risk: technologies and policy](https://www.schneier.com/academic/courses/catastrophic-risk/) (Bruce Schneier, Harvard University)
- [Effective altruism and the future of humanity](https://rychappell.substack.com/p/updated-syllabus-on-ealongtermism) (Richard Yetter Chappell, University of Miami)
- [Ethics and the future](</ox-hugo/Kagan - Reading Assignments with note.pdf>) (Shelly Kagan, Yale University)
- [Ethics, evidence, and policy](https://static1.squarespace.com/static/57be816a579fb351c73571ad/t/6193d71dc07efb2c8ed455f7/1637078813174/Ethics__Evidence__and_Policy__Syllabus.pdf) (Rush T. Stewart, Ludwig-Maximilians-Universität München)
- [Existential risk](https://theprecipice.com/syllabus) (Toby Ord, University of Oxford)
- [Existential risks](https://forum.effectivealtruism.org/posts/wmAQavcKjWc393NXP/example-syllabus-existential-risks) (Simon Friederich, University of Groningen)
- [Existential risks introductory course](https://docs.google.com/document/d/1gFXI1ccvd_G78LvNfwoRNMojud1IYptZU0oyUY4wrPU/edit) (Cambridge Existential Risks Initiative)
- [Global catastrophe since 1750](https://history.yale.edu/sites/default/files/files/2017-01%20rankin%20-%20catastrophe.pdf) (William Rankin, Yale University)
- [Longtermism and existential risk](https://www.rug.nl/filosofie/organization/news-and-events/events/2020/ppe/simon-friederich-andreas-schmidt-longtermism-and-existential-risk?fbclid=IwAR3tOuP__augoxmUbBIZM2-9eK1OKT0vevYT2YHkjUdlbcW8cVNBCN04vjI) (Simon Friederich &amp; Andreas Schmidt)
- [Preventing human extinction](https://drive.google.com/file/d/1FnvDSjIW071L_5kuEafJiq_o5QtrRaKa/view) ([additional readings](https://docs.google.com/document/d/1EnCaOpFl7D7anspk5g-wNbRm_9_5vymVOMO7pBndzdM/edit)) (Paul Edwards &amp; Steve Luby, Stanford University)
- [Safeguarding the future](https://canvas.mit.edu/courses/14137) (Kevin Esvelt &amp; Michael Specter, Massachusetts Institute of Technology)
- [The end of the world](https://jsr.droppages.com/courses/gesm-spring-2021.html) (Jeff Sanford Russell, University of Southern California)
- [Topics in global priorities research](https://globalprioritiesinstitute.org/topics-in-global-priorities-research/) (William MacAskill &amp; Christian Tarsney, University of Oxford)
(Dr Schneier informs me that, regrettably, the materials for his course on catastrophic risk have not been preserved.)
_With thanks to Prof. Shelly Kagan for sharing the syllabus for his course (and adding a helpful introductory note) and to Bastian Stern for discovering many of the courses listed._]]></description></item><item><title>Wild animal welfare: a bibliography</title><link>https://stafforini.com/notes/wild-animal-welfare-a-bibliography/</link><pubDate>Thu, 06 Jun 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/wild-animal-welfare-a-bibliography/</guid><description>&lt;![CDATA[This bibliography is primarily based on {{< cite= "-Horta2012PublicationsInEnglish"=>}}, {{< cite= "-Dorado2015EthicalInterventionsWild"=>}}, and the research that [Aron Vallinder](http://www.vallinder.se/) and I did for a paper on wild animal welfare that we once planned to write. If you know of relevant material not included in the list below, please [let me know](/notes/contact/).
---
- {{< cite= "Aaltola2010AnimalEthicsArgument"=>}}
- {{< cite= "Alward2000NaiveArgumentMoral"=>}}
- {{< cite= "Benatar2001WhyNaiveArgument"=>}}
- {{< cite= "Bovenkerk2003ActNotAct"=>}}
- {{< cite= "Bruers2015PredationProcreationProblems"=>}}
- {{< cite= "Carpendale2015WelfareBiologyExtension"=>}}
- {{< cite= "Clark1979RightsWildThings"=>}}
- {{< cite= "Clarke2006PopulationDynamicsAnimal"=>}}
- {{< cite= "Cowen2003PolicingNature"=>}}
- {{< cite= "Cunha2015IfNaturalEntities"=>}}
- {{< cite= "Dawkins1995GodUtilityFunction"=>}}
- {{< cite= "Donaldson2011ZoopolisPoliticalTheory"=>}}
- {{< cite= "Donaldson2013DefenseAnimalCitizens"=>}}
- {{< cite= "Dorado2015EthicalInterventionsWild"=>}}
- {{< cite= "Ebert2012InnocentThreatsProblem"=>}}
- {{< cite= "Everett2001EnvironmentalEthicsAnimal"=>}}
- {{< cite= "Faria2015AnimalsNeedProblem"=>}}
- {{< cite= "Faria2015MakingDifferenceBehalf"=>}}
- {{< cite= "Faria2015DisentanglingObligationsAssistance"=>}}
- {{< cite= "Favre1979WildlifeRightsEver"=>}}
- {{< cite= "Fink2005PredationArgument"=>}}
- {{< cite= "Gould1982NonmoralNature"=>}}
- {{< cite= "Hadley2006DutyToAid"=>}}
- {{< cite= "Hettinger1994ValuingPredationRolston"=>}}
- {{< cite= "Hills2010UtilitarianismContractualismDemandingness"=>}}
- {{< cite= "Horta2010DisvalueNatureIntervention"=>}}
- {{< cite= "Horta2010EthicsEcologyFear"=>}}
- {{< cite= "Horta2010DebunkingIdyllicView"=>}}
- {{< cite= "Horta2013ZoopolisInterventionState"=>}}
- {{< cite= "Horta2015ProblemEvilNature"=>}}
- {{< cite= "Hutchins1987WildlifeConservationAnimal"=>}}
- {{< cite= "Jamieson1990RightsJusticeDuties"=>}}
- {{< cite= "Kirkwood1996EthicsInterventionsWelfare"=>}}
- {{< cite= "BruceLauber2007RoleEthicalJudgments"=>}}
- {{< cite= "Mannino2015HumanitarianInterventionNature"=>}}
- {{< cite= "McKelvie2015SeekingIncreaseAwareness"=>}}
- {{< cite= "McMahan2010MeatEaters"=>}}
- {{< cite= "McMahan2010PredatorsResponse"=>}}
- {{< cite= "McMahan2015MoralProblemPredation"=>}}
- {{< cite= "Moen2016EthicsOfWild"=>}}
- {{< cite= "Thornhill2006AnimalLiberationistResponses"=>}}
- {{< cite= "Mosquera2015HarmTheyInflict"=>}}
- {{< cite= "Musschenga2002NaturalnessAnimalWelfare"=>}}
- {{< cite= "Naess1991ShouldWeTry"=>}}
- {{< cite= "Ng1995WelfareBiologyEvolutionary"=>}}
- {{< cite= "Paez2015IntuitionsGoneAstray"=>}}
- {{< cite= "Paez2015RefusingHelpInflicting"=>}}
- {{< cite= "Palmer2015ViewThatWe"=>}}
- {{< cite= "Pearce2015WelfareStateElephants"=>}}
- {{< cite= "Raterman2008EnvironmentalistLamentPredation"=>}}
- {{< cite= "Regan2004CaseAnimalRights"=>}}
- {{< cite= "Rollin1981AnimalRightsHuman"=>}}
- {{< cite= "Rolston1992DisvaluesNature"=>}}
- {{< cite= "Sagoff1984AnimalLiberationEnvironmental"=>}}
- {{< cite= "Sapontzis1984Predation"=>}}
- {{< cite= "Sapontzis1987MoralsReasonAnimals"=>}}
- {{< cite= "Simmons2009AnimalsPredatorsRight"=>}}
- {{< cite= "Singer1973FoodThoughtReply"=>}}
- {{< cite= "SittlerAdamczewski2016ConsistentVegetarianismSuffering"=>}}
- {{< cite= "Sozmen2013HarmWildFacing"=>}}
- {{< cite= "Sozmen2015RelationsMoralObligations"=>}}
- {{< cite= "Tomasik2015ImportanceOfWild"=>}}
- {{< cite= "Torres2015CaseInterventionNature"=>}}
- {{< cite= "Young2006StatusVermin"=>}}
_With thanks to Tom Bradschetl and Ricardo Torres._]]></description></item><item><title>California Proposition 2: a bibliography</title><link>https://stafforini.com/notes/california-proposition-2-a-bibliography/</link><pubDate>Sun, 09 Jun 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/california-proposition-2-a-bibliography/</guid><description>&lt;![CDATA[This bibliography was prepared for a research project on California Proposition 2 that I undertook a while ago, while volunteering for [Effective Animal Activism](http://www.effectiveanimalactivism.org/). As I became increasingly busy with many other activities, the project was eventually put on hold. I hope to resume work at some point in the future; in the meantime, I thought I should at least make this material public, in the hope that it might inspire others to do research in this and related areas.
- {{< cite= "Bell2005ReviewRecentPublications"=>}}
- {{< cite= "Corbin2011WhatFactorsInfluence"=>}}
- {{< cite= "Kuykendall2012SelectedNewspaperCoverage"=>}}
- {{< cite= "Lovvorn2008CaliforniaPropositionWatershed"=>}}
- {{< cite= "Lusk2010EffectPropositionDemand"=>}}
- {{< cite= "Newman2008FiscalEconomicEffects"=>}}
- {{< cite= "Richards2011MediaAdvertisingBallot"=>}}
- {{< cite= "Sumner2010EconomicsRegulationsHen"=>}}
- {{< cite= "Thapar2011TakingLiveStock"=>}}]]></description></item><item><title>Earning to give: an annotated bibliography</title><link>https://stafforini.com/notes/earning-to-give-an-annotated-bibliography/</link><pubDate>Sat, 22 Mar 2014 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/earning-to-give-an-annotated-bibliography/</guid><description>&lt;![CDATA[A while ago, I did a quick survey of the literature on earning to give—the pursuit of a high-earning career with the express purpose of donating a large portion of one's earnings to high-impact charities. Given the recent interest in the topic, I thought I should turn those notes into a proper bibliography. If I'm missing anything, please let me know.
{{< cite= "Andreev2013MaximizingDonationsVia"=>}}
Chronicles the author's experience in finding a job as a software engineer with the goal of earning to give.
{{< cite= "Brooks2013WayProducePerson"=>}}
If your profoundest interest is dying children in Africa or Bangladesh, it's probably best to go to Africa or Bangladesh, not to Wall Street.
{{< cite= "Carter2013VocationEarningGive"=>}}
Working to fund one's philanthropic ventures is certainly noble. But we shouldn't downplay the value of the income-generating work just because we can't see as directly how it helps others.
{{< cite= "Farquhar2012ReplaceabilityEffectWorking"=>}}
When we look at the consequences of our actions, and consider whether to take a job in a harmful industry, the harm of our taking the job is somewhat less than it first appears. There is still a harm, though, so you shouldn't take the job unless you think you can do something pretty good with it.
{{< cite= "Farquhar2012CollectiveActionWorking"=>}}
You need to pay attention to what other EAs are doing. But that doesn't mean we should always avoid working in harmful industries, or stop thinking in general about how to individually make the most difference.
{{< cite= "Farquhar2012UniversalisabilityImmoral"=>}}
We recommend earning to give only because we look at the way the world is and we reckon it makes a positive difference. If the world became different, and lots of people naturally decided to do earning to give, we'd recommend something else.
{{< cite= "Hallquist2014WhyEarnGive"=>}}
An engaging, informal introduction to earning to give. Recommended.
{{< cite= "Hoskin2013HowMuchTaxes"=>}}
Suppose you're looking to donate as much as possible to charity, and are choosing between two jobs. Should you worry about the taxes in each location?
{{< cite= "Hurford2013WhatEarningGive"=>}}
A survey of the field.
{{< cite= "Karnofsky2013OurTakeEarning"=>}}
We're excited about "earning to give" as one option among many.
{{< cite= "Kaufman2011HowMuchShouldYou"=>}}
Earn and give as much as you can for the level of personal suffering you are prepared to accept.
{{< cite= "Kaufman2011WhatAboutNonWork"=>}}
Even in your spare time, which you usually can't turn into money to donate by working additional hours, you should still not engage in local charitable activities.
{{< cite= "Kaufman2012ProfessionalPhilanthropy"=>}}
A brief discussion of the suitability of that expression, before 'earning to give' had become established.
{{< cite= "Kaufman2012HistoryEarningGive"=>}}
Credits Brian Tomasik with the first formulation of the idea.
{{< cite= "Kaufman2013SummariesEarningGive"=>}}
Earning to give involves four main ideas: (1) donate; (2) donate to the most effective organizations; (3) earn more so you can give more; (4) spend less so you can give more.
{{< cite= "Kaufman2013ArguingAboutBanking"=>}}
Examples of people in clearly beneficial jobs like Boris Yakubchik (high school math teacher) and Julia Wise (social worker at a prison) are both much less controversial and much more attainable for the typical reader.
{{< cite= "Kaufman2013HistoryEarningGiveII"=>}}
Quotes an exchange between Singer and an early proponent of earning to give.
{{< cite= "Kaufman2013HistoryEarningGiveIII"=>}}
Claims that John Wesley, founder of the Methodist movement, was an early advocate of earning to give.
{{< cite= "Kaufman2016EarningGive"=>}}
Earning to give is a career path that is well suited to people who are good at earning money, who are still exploring cause areas, who prioritize interventions that are funding-limited, who are early in their careers and want to build their skills, or who want to balance altruism against other things in their lives. I find that it suits me well, but I also can imagine myself doing something else five years from now.
{{< cite= "Kuhn2013DowngradingConfidenceEarning"=>}}
Concludes that (1) pursuing high-paying, highly skilled careers might be dominated by doing directly charitable things and that (2) effective altruists should probably be spreading a broader message.
{{< cite= "Kuhn2013CommonObjectionsEarning"=>}}
Discusses five objections to earning to give.
{{< cite= "MacAskill2011BankingEthicalCareer"=>}}
Altruistic bankers earn a lot, aren't likely to be replaceable, and can support the very best charities. They are likely to do more good than someone in an "ethical" career.
{{< cite= "MacAskill2012FollowingSchindlersFootsteps"=>}}
Uses Schindler's example to discuss the morality of working for an evil corporation.
{{< cite= "MacAskill2013SaveWorldDont"=>}}
Earning to give is often the best career option because of (1) discrepancy in earnings, (2) replaceability and (3) high variations in charity cost-effectiveness.
{{< cite= "MacAskill2014ReplaceabilityCareerChoice"=>}}
Defends the idea that deliberately pursuing a lucrative career in order to donate a large proportion of one's earnings is typically ethically preferable to a career within the charity sector.
{{< cite= "MacAskill201580000HoursThinks"=>}}
80,000 Hours never claimed that most people should earn to give, and it now thinks that even fewer people should pursue this path to impact than it did before.
{{< cite= "MacAskill2016ShouldYouSwitch"=>}}
When considering whether to do direct work or earn to give, you could ask yourself: am I in the top 15% of people in terms of comparative advantage at earning to give?
{{< cite= "MacAskill2016BankingEthicalCareer"=>}}
Condensed version of 'Replaceability, career choice, and making a difference'.
{{< cite= "Matthews2013JoinWallStreet"=>}}
A popular, engaging piece, with profiles of many prominent advocates and practitioners of earning to give.
{{< cite= "Penalva2015QueGanarPara"=>}}
An engaging introduction for Spanish-speaking readers.
{{< cite= "Redwood2012FlatMarginEffect"=>}}
Argues that if you take a job that seems to have a strong (positive or negative) impact on the economy, the actual difference it makes to social welfare will be minimal.
{{< cite= "Salam2013RiseSingerians"=>}}
A criticism from a conservative perspective. Claims that people motivated by curiosity and novelty or a desire for recognition may have a much bigger positive impact than people who try to do good deliberately.
{{< cite= "Sinick2013EarningGiveAltruistic"=>}}
Responds to MacAskill's Quartz piece on earning to give.
{{< cite= "Shulman2012EntrepreneurshipGamePoker"=>}}
It would be a mistake to think of the returns to entrepreneurship as predictably stemming from just showing up and taking a spin at the wheel of startup roulette. Instead, entrepreneurship is more like poker: a game where even the best players cannot predictably win over a single night, but measurable differences predict that some will earn much more than others on average.
{{< cite= "Shulman2012SoftwareEngineeringBritain"=>}}
How attractive is the software industry for those who want to make money and use it to do good? In some ways, the British statistics are misleading, but they also reflect a real difference: software engineers in the US, and especially Silicon Valley, really are better compensated. The post lays out the supporting data, and discusses ways people outside the United States can make their way to Silicon Valley.
{{< cite= "Shulman2012SalaryStartupHow"=>}}
Altruists have stronger reasons to pursue risky careers because the standard arguments for risk aversion do not apply.
{{< cite= "Todd2013ShowMeHarm"=>}}
Makes some very rough estimates of how harmful finance would have to be in order for its harm to outweigh the good realized by the donations of someone who earns to give.
{{< cite= "Todd2013ComparisonMedicalResearch"=>}}
Earning to give in finance is slightly better than medical research.
{{< cite= "Todd2014HowMuchPeople"=>}}
Attempts to estimate how much people pursuing earning to give donate, how much they can be expected to donate in the immediate future, and how much extra giving was caused by 80,000 Hours.
{{< cite= "Tomasik2006WhyActivistsShould"=>}}
A pioneering essay.
{{< cite= "Xodarap2014PoliticalSkillsWhich"=>}}
An annotated bibliography of a few recent meta-analyses of predictors of income.
_With thanks to Imma Six._]]></description></item><item><title>My beliefs</title><link>https://stafforini.com/notes/my-beliefs/</link><pubDate>Thu, 05 Feb 2015 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/my-beliefs/</guid><description>&lt;![CDATA[My friend Brian Tomasik recently made available [a table](http://reducing-suffering.org/summary-beliefs-values-big-questions/) summarizing his beliefs on various issues. I thought recording my own credences on these propositions would be a fun and potentially instructive exercise, and decided to make my answers public. To prevent myself from being influenced by Brian's responses, I copied the original table, pasted it into an Excel spreadsheet, deleted the column with his answers, and randomized the rows by sorting them in alphabetical order. Only after recording all my responses did I allow myself to look at Brian's, and I managed to resist the temptation to make any changes _ex post facto_. Overall, I was pleasantly surprised at the degree to which we agree, and not very surprised at the areas where we don't agree. I also suspect that a few of our disagreements (e.g. on compatibilism about free will) are [merely verbal](http://consc.net/papers/verbal.pdf). Below, I comment on the propositions on which we disagree the most.
If you think you might want to participate in this exercise, you can make a duplicate of [this spreadsheet](https://docs.google.com/spreadsheets/d/14hKOHhLTyD1DkYPTdUriRGW2oLPfvxRic1FtzUXg6Ds/edit) and record your answers there before reading any further.
**Update**: See also Michael Dickens' responses in the comments section.
**Update 2**: There is now [a new version](/notes/my-beliefs-updated/) of the table below, which reflects my beliefs as of November 2020.
| **Belief** | **Brian** | **Pablo** |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------|-----------|
| "Aesthetic value: objective or subjective?" Answer: subjective | 99.50% | 99.90% |
| Artificial general intelligence (AGI) is possible in principle | 99% | 95% |
| Compatibilism on free will | 98% | 5% |
| "Abstract objects: Platonism or nominalism?" Answer: nominalism | 97% | 90% |
| Moral anti-realism | 96% | 20% |
| Humans will eventually build human-level AGI conditional on no other major intervening disruptions to civilization as we know it | 90% | 90% |
| We live in at least a Level I multiverse | 85% | 60% |
| Type-A physicalism regarding consciousness | 75% | 25% |
| Eternalism on philosophy of time | 75% | 98% |
| Earth will eventually be controlled by a singleton of some sort | 72% | 70% |
| Human-inspired colonization of space will cause net suffering if it happens | 71% | 1% |
| Many worlds interpretation of quantum mechanics (or close kin) | 70% | 70% |
| Soft AGI takeoff | 70% | 55% |
| By at least 10 years before human-level AGI is built, debate about AGI risk will be as mainstream as global warming is in 2015 | 67% | 70% |
| Human-controlled AGI would result in less expected suffering than uncontrolled, as judged by the beliefs I would hold if I thought about the problem for another 10 years | 65% | 5% |
| A government will build the first human-level AGI, assuming humans build one at all | 62% | 60% |
| Climate change will cause net suffering | 60% | 50% |
| By 2100, if biological humans still exist, most of them will regard factory farming as a great evil of the past | 60% | 10% |
| The effective-altruism movement, all things considered, reduces rather than increases suffering in the far future | 60% | 40% |
| Electing more liberal politicians reduces net suffering in the far future | 57% | 60% |
| Faster technological innovation increases net suffering in the far future | 55% | 55% |
| "Science: scientific realism or scientific anti-realism?" Answer: realism | 60% | 90% |
| At bottom, physics is digital | 50% | — |
| Cognitive closure of some philosophical problems | 50% | 80% |
| Rare Earth explanation of Fermi Paradox | 50% | 25% |
| Crop cultivation prevents net suffering | 50% | 50% |
| Conditional on a government building the first human-level AGI, it will be the USA (rather than China, etc.) | 50% | 65% |
| Earth-originating intelligence will colonize the entire galaxy (ignoring anthropic arguments) | 50% | 10% |
| Faster economic growth will cause net suffering in the far future | 43% | 55% |
| Whole brain emulation will come before de novo AGI, assuming both are possible to build | 42% | 65% |
| Modal realism | 40% | 1% |
| The multiverse is finite | 40% | 70% |
| A world government will develop before human-level AGI | 35% | 15% |
| Wild-animal suffering will be a mainstream moral issue by 2100, conditional on biological humans still existing | 15% | 8% |
| Humans will go extinct within millions of years for some reason other than AGI | 10% | 10% |
| A design very close to CEV will be implemented in humanity's AGI, conditional on AGI being built (excluding other value-learning approaches and other machine-ethics proposals) | 5% | 10% |
**Values**
| **Value system** | **Brian** | **Pablo** |
|-------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------|-----------|
| Negative utilitarianism focused on extreme suffering | 90% | 1% |
| Ethical pluralism for other values (happiness, love, friendship, knowledge, accomplishment, diversity, paperclips, and other things that agents care about) | 10% | 0.01% |
| **The kind of suffering that matters most is…** | **Brian** | **Pablo** |
|-------------------------------------------------|-----------|-----------|
| hedonic experience | 70% | 99% |
| preference frustration | 30% | 1% |
Comments {#comments}
_Compatibilism on free will_
If by 'free will' we mean what most people mean by that expression, then I'm almost certain that we don't have free will, and that free will is incompatible with determinism. Brian appears to believe otherwise because he thinks that the meaning of 'free will' should ultimately be governed by instrumental considerations. I think this approach fosters muddled thinking and facilitates intellectual dishonesty.
_Moral anti-realism_
This is one in a series of very strong disagreements about questions in metaethics. I think some things in the world—pleasant states of consciousness—objectively have value, regardless of whether anyone desires them or has any other positive attitude towards them. This value, however, is realized in conscious experience and as such exists in the natural world, rather than being _sui generis_, as most moral realists believe. So the main arguments against moral realism, such as Mackie's argument from queerness, do not apply to the view I defend.
_Type-A physicalism regarding consciousness_
I'm somewhat persuaded by David Chalmers' arguments for dualism, though I discount the apparent force of this evidence by the good track-record of physicalist explanations in most other domains. About half of my probability mass for physicalism is concentrated on type-B physicalism—hence my relatively low credence on the type of physicalism that Brian favors.
_Eternalism on philosophy of time_
I assume Brian just meant belief in the static, or _tenseless_, theory of time (what McTaggart called the 'B theory'), rather than some subtler view about temporal becoming. If so, I think the arguments against the dynamic or tensed theory are roughly as strong as the arguments against free will: both of these views are primarily supported by introspection ("it seems time flows", "it seems I'm free") and opposed by naturalism, and the latter is much better supported by the evidence.
_Human-inspired colonization of space will cause net suffering if it happens_
I'm puzzled by Brian's answer here. I think it's quite unlikely that the future will contain a preponderance of suffering over happiness, since it will be optimized by agents who strongly prefer the latter and will have driven most other sentient lifeforms extinct. But maybe Brian meant that colonization will cause a surplus of suffering relative to the amount present before colonization. I think this is virtually certain; I'd give it a 99% chance.
_Human-controlled AGI would result in less expected suffering than uncontrolled, as judged by the beliefs I would hold if I thought about the problem for another 10 years_
Uncontrolled AGI will likely result in no suffering at all, whereas human-controlled AGI will result in some suffering and much more happiness. Brian thinks the goal systems that uncontrolled AGI might create will resemble paradigmatic sentient beings enough to themselves count as sentient, but I do not share his radically subjectivist stance concerning the nature of suffering (on which something counts as suffering roughly if, and to the degree that, we decide to care about it).
_By 2100, if biological humans still exist, most of them will regard factory farming as a great evil of the past_
I slightly misread the sentence, which I took to state that most future people will regard factory farming as _the_ great evil of the past. Given the high number of alternative candidates (from a non-enlightened, common-sense human morality), I thought it unlikely that our descendants would single out factory farming as the single greatest evil in human history. But it's much more likely (~60%) that they will regard it as _a_ great evil, especially if mass production of _in vitro_ meat displaces factory farms (and hence removes the main cognitive barrier to widespread recognition that all sentient beings matter morally, and matter equally).
_"Science: scientific realism or scientific anti-realism?" Answer: realism_
I think the best explanation of the extraordinary success of science is that it describes the world as it really is; on anti-realism, this success is a mystery. So I think realism is much more likely.
_The effective-altruism movement, all things considered, reduces rather than increases suffering in the far future_
I think the EA movement somewhat increases the probability of far-future suffering by increasing the probability that such a future will exist at all, to a greater degree than it reduces the suffering of far-future sentients conditional on their existence.
(Note that I believe the EA movement _decreases_ the overall probability of _net_ suffering in the far future. That is, EA will cause, in expectation, more suffering, but even more happiness.)
_Cognitive closure of some philosophical problems_
Reflection about our evolutionary origins makes it quite likely that we lack the cognitive machinery necessary for solving at least some such problems. We are probably among the stupidest intelligent species possible, since we were the first to fill the niche and we haven't evolved much since. (But see [Greg Egan](http://metamagician3000.blogspot.com.ar/2009/09/interview-with-greg-egan.html) for an opposing perspective.)
_Earth-originating intelligence will colonize the entire galaxy (ignoring anthropic arguments)_
I think extinction before colonization is quite likely; ~20% this century, and we have a long way to go.
_Rare Earth explanation of Fermi Paradox_
Very low confidence in my answer here; Brian is probably closer to the truth than I am.
_Modal realism_
Upon reflection, I think I was overconfident on this one; 5-10% seems like a more reasonable estimate. The main reason to believe modal realism is that it would provide an answer to the [ultimate question of existence](/notes/why-does-the-world-exist-a-bibliography/). However, the existence of nothing would have been an even more natural state of affairs than the existence of everything. Since this possibility manifestly does not obtain, that reduces the probability of modal realism (see {{< cite "Grover1998CosmologicalFecundity" >}}).
_Negative utilitarianism focused on extreme suffering_
I see no reason to deviate from symmetry: suffering of a given intensity is as bad as happiness of that intensity is good; and suffering twice as intense is (only) twice as bad. Brian thinks there isn't a natural intensity matching between happiness and suffering, but I disagree (and trust my judgment here).
_Ethical pluralism for other values (happiness, love, friendship, knowledge, accomplishment, diversity, paperclips, and other things that agents care about)_
I just cannot see how anything but experience could have value. As I view things, the choice is between some form of experientialism and nihilism, _tertium non datur_.
_The kind of suffering that matters most is... preference frustration_
Again, I have difficulty understanding how merely satisfying a preference, in the absence of any conscious _experience of satisfaction_, could matter at all morally. Preference theories are also beset by a number of problems; for example, it's unclear how they can deal with preferences whose objects are spatially or temporally very removed from the subject, without either restricting the class of relevant preferences arbitrarily or implying gross absurdities.
_With thanks to Brian Tomasik, Peter McIntyre, and others for valuable discussion._]]></description></item><item><title>My beliefs, updated</title><link>https://stafforini.com/notes/my-beliefs-updated/</link><pubDate>Sun, 08 Nov 2020 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/my-beliefs-updated/</guid><description>&lt;![CDATA[Back in 2015, I published [a post](/notes/my-beliefs/) listing my beliefs on various propositions. This post updates that list to reflect what I currently believe. The new table also has a new column, indicating the resilience of each belief, defined as how unlikely my credences would be to change if I thought more about the topic.
Note that, although the credences stated in the 2015 post are outdated, the substantive comments included there still largely reflect my current thinking. Accordingly, you may still want to check out that post if you are curious about why I hold these beliefs to the degree that I do.
| **Proposition** | **Credence** | **Resilience** |
|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------|----------------|
| "Aesthetic value: objective or subjective?" Answer: subjective | 100% | high |
| Artificial general intelligence (AGI) is possible in principle | 95% | highish |
| Compatibilism on free will | 10% | highish |
| "Abstract objects: Platonism or nominalism?" Answer: nominalism | 95% | highish |
| Moral anti-realism | 30% | medium |
| Humans will eventually build human-level AGI conditional on no other major intervening disruptions to civilization as we know it | 85% | highish |
| We live in at least a Level I multiverse | 30% | low |
| Type-A physicalism regarding consciousness | 10% | medium |
| Eternalism on philosophy of time | 95% | medium |
| Earth will eventually be controlled by a singleton of some sort | 60% | medium |
| Human-inspired colonization of space will cause net suffering if it happens | 10% | highish |
| Many worlds interpretation of quantum mechanics (or close kin) | 55% | lowish |
| Soft AGI takeoff | 60% | lowish |
| By at least 10 years before human-level AGI is built, debate about AGI risk will be as mainstream as global warming is in 2015 | 55% | medium |
| Human-controlled AGI would result in less expected suffering than uncontrolled, as judged by the beliefs I would hold if I thought about the problem for another 10 years | 5% | highish |
| A government will build the first human-level AGI, assuming humans build one at all | 35% | lowish |
| Climate change will cause net suffering | 55% | lowish |
| By 2100, if biological humans still exist, most of them will regard factory farming as a great evil of the past | 70% | medium |
| The effective-altruism movement, all things considered, reduces rather than increases suffering in the far future [NB: I think EA very likely reduces _net_ suffering in expectation; see comments [here](/notes/my-beliefs/).] | 35% | medium |
| Electing more liberal politicians reduces net suffering in the far future | 55% | lowish |
| Faster technological innovation increases net suffering in the far future | 50% | medium |
| "Science: scientific realism or scientific anti-realism?" Answer: realism | 95% | highish |
| At bottom, physics is digital | 15% | low |
| Cognitive closure of some philosophical problems | 80% | medium |
| Rare Earth explanation of Fermi Paradox | 10% | lowish |
| Crop cultivation prevents net suffering | 50% | medium |
| Conditional on a government building the first human-level AGI, it will be the USA (rather than China, etc.) | 45% | medium |
| Earth-originating intelligence will colonize the entire galaxy (ignoring anthropic arguments) | 15% | medium |
| Faster economic growth will cause net suffering in the far future | 50% | medium |
| Whole brain emulation will come before de novo AGI, assuming both are possible to build | 40% | medium |
| Modal realism | 20% | lowish |
| The multiverse is finite | 70% | low |
| A world government will develop before human-level AGI | 10% | highish |
| Wild-animal suffering will be a mainstream moral issue by 2100, conditional on biological humans still existing | 15% | medium |
| Humans will go extinct within millions of years for some reason other than AGI | 40% | medium |
| A design very close to CEV will be implemented in humanity's AGI, conditional on AGI being built (excluding other value-learning approaches and other machine-ethics proposals) | 5% | medium |
| Negative utilitarianism focused on extreme suffering | 1% | highish |
| Ethical pluralism for other values (happiness, love, friendship, knowledge, accomplishment, diversity, paperclips, and other things that agents care about) | 0% | high |
| hedonic experience is most valuable | 95% | high |
| preference frustration is most valuable | 3% | high |]]></description></item><item><title>John McTaggart Ellis McTaggart: a bibliography</title><link>https://stafforini.com/notes/john-mctaggart-ellis-mctaggart-a-bibliography/</link><pubDate>Sat, 20 Dec 2014 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/john-mctaggart-ellis-mctaggart-a-bibliography/</guid><description>&lt;![CDATA[{{< figure src="/ox-hugo/john-mctaggart-portrait.jpg" alt="John McTaggart Ellis McTaggart" >}}
If we compare McTaggart with the other commentators on Hegel we must admit that he has at least produced an extremely lively and fascinating rabbit from the Hegelian hat, whilst they have produced nothing but consumptive and gibbering chimeras. And we shall admire his resource and dexterity all the more when we reflect that the rabbit was, in all probability, never inside the hat, whilst the chimeras perhaps were.
{{< cite "Broad1930DogmasReligion" "p. xxxi" >}}
1892 {#1892}
- {{< cite "McTaggart1892ChangesMethodHegel" >}}
- {{< cite "McTaggart1892ChangesMethodHegela" >}}
- {{< cite "McTaggart1892ReviewThomasMackay" >}}
1893 {#1893}
- {{< cite "McTaggart1893FurtherDeterminationAbsolute" >}}
- {{< cite "McTaggart1893TimeHegelianDialectic" >}}
1894 {#1894}
- {{< cite "McTaggart1894TimeHegelianDialectic" >}}
1895 {#1895}
- {{< cite "McTaggart1895NecessityDogma" >}}
- {{< cite "McTaggart1895ReviewWallaceHegel" >}}
1896 {#1896}
- {{< cite "McTaggart1896StudiesHegelianDialectic" >}}
- {{< cite "McTaggart1896HegelTheoryPunishment" >}}
1897 {#1897}
- {{< cite "McTaggart1897ReviewNoelLogique" >}}
- {{< cite "McTaggart1897HegelTreatmentCategories" >}}
- {{< cite "McTaggart1897HegelTreatmentCategoriesa" >}}
- {{< cite "McTaggart1897ConceptionSocietyOrganism" >}}
- {{< cite "McTaggart1897ReviewWatsonChristianity" >}}
1899 {#1899}
- {{< cite "McTaggart1899HegelTreatmentCategories" >}}
1900 {#1900}
- {{< cite "McTaggart1900CriticalNoticeJosiah" >}}
- {{< cite "McTaggart1900HegelTreatmentCategories" >}}
- {{< cite "McTaggart1900ReviewIngeChristian" >}}
- {{< cite "McTaggart1900ReviewWilsonTwo" >}}
1901 {#1901}
- {{< cite "McTaggart1901StudiesHegelianCosmology" >}}
1902 {#1902}
- {{< cite "McTaggart1902CriticalNoticeHowison" >}}
- {{< cite "McTaggart1902ReviewJoachimStudy" >}}
- {{< cite "McTaggart1902HegelTreatmentCategories" >}}
- {{< cite "McTaggart1902ReviewJosiahRoycea" >}}
1903 {#1903}
- {{< cite "McTaggart1903ConsiderationsRelatingHuman" >}}
- {{< cite "McTaggart1903ReviewJohnGrier" >}}
- {{< cite "McTaggart1903ReviewTennantOrigin" >}}
1904 {#1904}
- {{< cite "McTaggart1904HegelTreatmentCategories" >}}
- {{< cite "McTaggart1904ReviewRobertAdamson" >}}
- {{< cite "McTaggart1904HumanPreexistence" >}}
- {{< cite "McTaggart1904ReviewAspectsVedanta" >}}
1906 {#1906}
- {{< cite "McTaggart1906DogmasReligion" >}}
1907 {#1907}
- {{< cite "McTaggart1907CriticalNoticeOrmond" >}}
1908 {#1908}
- {{< cite "McTaggart1908CriticalNoticeWilliam" >}}
- {{< cite "McTaggart1908IndividualismValue" >}}
- {{< cite "McTaggart1908UnrealityTime" >}}
1909 {#1909}
- {{< cite "McTaggart1909RelationTimeEternity" >}}
- {{< cite "McTaggart1909ReviewEdwardWestermarck" >}}
1910 {#1910}
- {{< cite "McTaggart1910CommentaryHegelLogic" >}}
- {{< cite "McTaggart1910DareBeWise" >}}
1912 {#1912}
- {{< cite "McTaggart1912ReviewBosanquetPrinciple" >}}
- {{< cite "McTaggart1912ReviewGeorgLasson" >}}
1913 {#1913}
- {{< cite "McTaggart1913ReviewAdolfPhalen" >}}
1915 {#1915}
- {{< cite "McTaggart1915MeaningCausality" >}}
1916 {#1916}
- {{< cite "McTaggart1916HumanImmortalityPreExistence" >}}
1921 {#1921}
- {{< cite "McTaggart1921NatureExistence" >}}
- {{< cite "McTaggart1921ImmortalityMonadisticIdealism" >}}
1923 {#1923}
- {{< cite "McTaggart1923ReviewSethPringlePattison" >}}
- {{< cite "McTaggart1923PropositionsApplicableThemselves" >}}
1925 {#1925}
- {{< cite "McTaggart1925ReviewXavierLeon" >}}
1927 {#1927}
- {{< cite "McTaggart1927TheNatureExistence" >}}
1934 {#1934}
- {{< cite "McTaggart1934PhilosophicalStudies" >}}
1938 {#1938}
- {{< cite "Keeling1938McTaggartNatureExistence" >}}]]></description></item><item><title>Why does the world exist? — A bibliography</title><link>https://stafforini.com/notes/why-does-the-world-exist-a-bibliography/</link><pubDate>Thu, 27 Feb 2014 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/why-does-the-world-exist-a-bibliography/</guid><description>&lt;![CDATA[That anything should exist at all does seem to me a matter for the deepest awe.
J. J. C. Smart
It is not how things are in the world that is mystical, but that it exists.
Ludwig Wittgenstein
The most important question of all is also one of the most neglected. The list below includes what are, to the best of my knowledge, the only scholarly writings in the contemporary literature that directly address the question of why the universe exists. (That question includes two sub-questions, 'Why is there anything at all?' and 'Why does this particular world exist, rather than some other?', which we may call the "general" and the "special" ultimate questions of existence, respectively.) Note that this excludes non-scholarly writings (such as Jim Holt's _Why does the world exist?_); writings that address the question indirectly (such as those found in certain areas within physical cosmology and the philosophy of religion); and writings too removed from the present (published before, say, 1850, such as Leibniz's _De rerum originatione radicali_). My personal recommendations are **boldfaced**.
If you think I'm missing something, please let me know.
- {{< cite "Armour1987ValuesGodProblem" >}}
- {{< cite "Blackburn2009Philosophy" "pp. 132–141" >}}
- {{< cite "Brooks1992WhyThisWorld" >}}
- {{< cite "Burke1984HumeEdwardsWhy" >}}
- {{< cite "Carlson2001PresumptionNothingness" >}}
- {{< cite "Carroll2018WhyThereSomething" >}}
- {{< cite "Conee2005RiddlesExistence" "ch. 5" >}}
- {{< cite "Davies2003WhyThereAnything" >}}
- {{< cite "Edwards1967Why" >}}
- {{< cite "Fleming1988WhyThereSomething" >}}
- {{< cite "Goldschmidt2013PuzzleExistenceWhy" >}}
- {{< cite "Goldstick1979WhyThereSomething" >}}
- **{{< cite "Grover1998CosmologicalFecundity" >}}**
- {{< cite "Grunbaum1993CreationCosmology" >}}
- {{< cite "Grunbaum2009WhyThereWorld" >}}
- {{< cite "Heller2009UltimateExplanationsUniverse" >}}
- {{< cite "Heylen2017WhyThereSomething" >}}
- {{< cite "James1911ProblemBeing" >}}
- {{< cite "Knight1956WhyNotNothing" >}}
- {{< cite "Kuhn2007WhyThisUniverse" >}}
- {{< cite "Kuhn2007CloserTruthScience" "pp. 235–258" >}}
- {{< cite "Kuhn2013WhyNotNothing" >}}
- {{< cite "Kusch1990WhyThereSomething" >}}
- {{< cite "Leslie1978EffortsExplainAll" >}}
- {{< cite "Leslie1979ValueExistence" >}}
- {{< cite "Leslie2005ReviewBedeRundle" >}}
- {{< cite "Leslie2009WhyThereSomething" >}}
- {{< cite "Leslie2009CosmosExistingEthical" >}}
- {{< cite "Leslie2013MysteryExistenceWhy" >}}
- {{< cite "Lowe1996WhyThereAnything" >}}
- {{< cite "Maitzen2012StopAskingWhy" >}}
- {{< cite "Mann2009PuzzleExistence" >}}
- {{< cite "Mawson2009WhyThereAnything" >}}
- {{< cite "Mortensen1986ExplainingExistence" >}}
- {{< cite "Munitz1965MysteryExistenceEssay" >}}
- {{< cite "Nagel2010WhyThereAnything" >}}
- {{< cite "Ng2019EvolvedGodCreationismView" >}}
- **{{< cite "Nozick1981WhyThereSomething" >}}**
- {{< cite "Parfit1991WhyDoesUniverse" >}}
- {{< cite "Parfit1992PuzzleRealityWhy" >}}
- **{{< cite "Parfit1998WhyAnythingWhy" >}}, {{< cite "Parfit1998WhyAnythingWhy2" >}}**
- {{< cite "Post1991WhyDoesAnything" >}}
- {{< cite "Rescher1984RiddleExistenceEssay" >}}
- {{< cite "Rescher2000OptimalismAxiologicalMetaphysics" >}}
- {{< cite "Rundle2004WhyThereSomething" >}}
- {{< cite "Sarkar1993SomethingNothingExplanation" >}}
- {{< cite "Schlesinger1998EnigmaExistence" >}}
- {{< cite "Schuppe2013WarumUberhauptEtwas" >}}
- {{< cite "Smith1997SimplicityWhyUniverse" >}}
- {{< cite "Smith1999ReasonUniverseExists" >}}
- {{< cite "Unger1984MinimizingArbitrarinessMetaphysics" >}}
- {{< cite "vanInwagen1996WhyThereAnything" >}}
- {{< cite "Wippel2011UltimateWhyQuestion" >}}
- {{< cite "Witherall2001FundamentalQuestion" >}}
- {{< cite "Witherall2002ProblemExistence" >}}
_In memoriam, Quentin Smith (1952-2020)_
{{< figure src="/ox-hugo/quentin-smith-portrait.jpg" alt="Quentin Smith" >}}]]></description></item><item><title>My 1996 Winnebago Rialta is for sale</title><link>https://stafforini.com/notes/my-1996-winnebago-rialta-is-for-sale/</link><pubDate>Wed, 14 Mar 2018 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/my-1996-winnebago-rialta-is-for-sale/</guid><description>&lt;![CDATA[{{< figure src="/ox-hugo/rialta-featured.jpeg" alt="1996 Winnebago Rialta parked on a residential street" >}}
**_Note: The Rialta is now sold._**
My 1996 Winnebago Rialta is for sale. This is the famous RV that, until last year, belonged to Tynan, author of the classic _The Tiniest Mansion: How To Live In Luxury on the Side of the Road in an RV_. Although I would love to keep it, I currently live in the UK and transporting the vehicle to Europe is a major logistical, financial, and bureaucratic hassle.
I have driven the Rialta only 500 miles since buying it from the previous owner, and nothing has changed relative to the state it was in when I purchased it, except for about $4,000 worth of improvements: new tires (including the spare), new spark plugs, a tune-up, and fresh fluids. I'm selling it for the same price I bought it for, which is a good deal given that these mechanical items have now been taken care of.
What follows is Tynan's [original announcement](http://tynan.com/rialta), which provides a comprehensive description of all the tweaks Tynan made to the vehicle over the years. If you have any questions, feel free to [get in touch](/notes/contact/).
[YouTube video](//www.youtube.com/embed/BjiQFCunJqk?rel=0;vq=hd720;rel=0;showinfo=0;modestbranding=1;autohide=1&controls=0)
If you've been thinking about the RV lifestyle and want to live in the RV that started the Rialta craze, here's your chance. Or if you just want a cool RV or a pied-a-terre in San Francisco, this could be for you.
Also included is a parking spot in San Francisco, if you want it. It's probably the only good RV spot in San Francisco and includes water and electricity. I also have a waste-dumping system that allows you to dump your waste without moving the RV from the parking spot. You'll have to negotiate the rate with the owner of the spot (a friend of mine), but he offered that if someone pays my asking price they can have the spot at an artificially low rate of $550/mo. He's a really nice guy. Location is in a prime area of the Castro with easy subway connection to downtown in 5 minutes. Nearby studios rent for $3500+, so the RV pays for itself in less than a year.
The RV itself is a 1996 Rialta Winnebago. This is the best RV to live in because it's much wider than Sprinter vans and similar. Mine has the rare floorplan that has a full-sized bed, which makes it very comfortable.
The RV itself has around 85k miles. I'm not in San Francisco so I can't check currently. I replaced the top half of the engine as well as the transmission, which cost me around $12k total.
I've done a tremendous amount of work to the RV, all without regard to price or effort to install. Below is a nearly-complete list of the features, though I'm sure I've forgotten some stuff.
Design {#design}
I installed maple floors in the main area, and black and white marble mosaic in the entryway. The ceiling is a real metal tin ceiling in gold tone. The curved transition area between the wall and ceiling is real 20k gold leaf, which is lit by LEDs hidden in the molding. Custom curtains were made from antique kimono fabric from Japan. Some of the windows are covered in mulberry shoji paper, but some have small tears now. Easy to remove or fix.
Windows on the driver's side of the RV are day-night shades custom made to size, with magnets to keep them in place.
I bought tatami mats and sized them to fit the bed platform so that I can roll it up and have a tea room. I now build tea rooms everywhere (Vegas, building island this summer), and you can own the first one I built.
The countertops were custom fabricated from high end Brazilian granite. A deep sink with new faucet was put in. The cabinet handles were replaced with brass and all cabinets were painted.
The desk is a custom made tigerwood desk with a low-profile drawer. Under it is a 100 year old persian rug which I will include if you want it, but would like to keep if you don't really care.
The closet was converted into a cedar closet, as was one of the drawers.
Lighting {#lighting}
You get a fully automated light system that I coded, which also controls the stereo and fan. The interface is basic, but it has powerful capabilities which you could expand. It runs off of a Raspberry Pi and a GC-100 IR/relay controller. I've installed LED strips just about everywhere in two different zones that can be independently controlled. I made two little art lights to shine an LED spot on the wall where I hung art. The desk lamp is also controllable. My software allows you to white balance lights, create scenes, fade over time, etc.
There's also a backup set of lights that you can control just by pushing buttons.
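For the curious, controlling relays on a Global Caché GC-100 like the one described above is straightforward: the unit accepts plain ASCII `setstate` commands over TCP port 4998. The sketch below is illustrative only, not the author's actual software; the IP address, module number, and relay number are made-up placeholders you would replace with your own unit's values.

```python
import socket

GC100_PORT = 4998  # Global Caché devices listen for ASCII commands on TCP 4998


def setstate_command(module: int, relay: int, on: bool) -> bytes:
    """Build a GC-100 'setstate' command, e.g. b'setstate,3:1,1\r'.

    Commands are plain ASCII, terminated by a carriage return.
    """
    return f"setstate,{module}:{relay},{1 if on else 0}\r".encode("ascii")


def set_relay(host: str, module: int, relay: int, on: bool) -> str:
    """Connect to the GC-100 and switch one relay, returning its reply."""
    with socket.create_connection((host, GC100_PORT), timeout=2) as sock:
        sock.sendall(setstate_command(module, relay, on))
        return sock.recv(64).decode("ascii").strip()


# Hypothetical usage: turn on an LED zone wired to module 3, relay 1.
# set_relay("192.168.1.70", 3, 1, True)
```

A scheduler or small web UI layered on top of a function like `set_relay` is all it takes to get scenes and timed fades of the kind described.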
AV Equipment {#av-equipment}
The head unit for the stereo is a Clarion with a flip out screen. It has full navigation with lane changing, is satellite ready, has an installed backup camera, etc. It's paired with an Alpine amplifier and upgraded speakers. I also built a custom subwoofer enclosure into the bed which has a JL 12" subwoofer. This system sounds amazing.
Mounted overhead is a small projector which projects onto an easy-to-deploy screen in the middle of the RV. You can turn the captains chairs around and watch like it's a movie theater, or you can flip the image and watch from bed. The power switch is mounted in the wall.
Over the desk is a 27" TV that I use as a second monitor. An HDMI switch allows you to output your laptop to either the TV or the projector. The 27" TV is on an extendable arm mounted to the metal frame of the RV so that it can be fully extended.
Kitchen {#kitchen}
The fridge is a Dometic DC fridge which runs off batteries and uses an average of 10W throughout the day. It even has a freezer that's cold enough to make ice. The range is a household two-burner gas range. I forget the brand, but it's a high-end unit meant for real cooking and has far higher output than most RV stoves.
Above the range are a bunch of magnetic spice boxes that adhere to the curved tin ceiling that comes down to the wall.
The sink is deep and includes a pullout faucet. It also has a cutting board that fits into it. Beside the regular sink is a second tap for filtered water which goes through a large three stage filter.
Power {#power}
On the roof are two 200W solar panels with low profile mounts. The solar controller is a Blue Sky 3024i MPPT controller with a monitoring panel and a shunt for measuring the power. The inverter and battery charger is a Xantrex Prosine 2.0. I custom wired everything so that you can use all of the outlets on either the inverter or shore power. This system is ridiculous overkill that would be more appropriate for a cabin than an RV.
The batteries are both dead and should be replaced. You could go really cheap if you don't plan on being off-grid, or you could get lithium ion batteries that would last a week without sun.
There's also a SeeLevel monitor which shows battery charge, LP gas levels, and fresh water levels down to the exact percentage.
Climate Control {#climate-control}
The RV comes with an Olympian Wave 6 propane heater with auto shutoff. This feels like a fireplace and gets really hot if you put it on high. It has a flexible hose and a quick-detach mount point in the kitchen.
Also installed is a 400W ceramic panel heater. Unless you get lithium ion batteries, you should run this only when you're plugged in. It's silent and located under the desk so you can be toasty while you work.
In the ceiling is a MaxxAir vent fan with variable speed control and reversible direction. It can be controlled by remote with thermostat or by my system. For some reason it's very hard to close now, so I just leave it open all the time.
I think that the AC in the car part of the RV might not work. I haven't used it in a long time so I don't remember.
Bathroom {#bathroom}
I replaced the old plastic toilet with a porcelain RV toilet with wooden seat and cover. It was virtually impossible to find one that would fit. The shower was modified to eject water outside (it used to go into the gray tank which would fill very quickly). The showerhead was replaced with a high-pressure low-flow oxygenics head.
Also included is a propane water heater that I never installed. The built-in one uses electricity.
Miscellaneous {#miscellaneous}
The skylight was recently replaced with a brand new one, but it doesn't have trim around it.
There's a Viper alarm with window break sensors and extra remotes. A friend tried to test it for me and said it didn't do anything, which I believe is because the remote batteries are dead.
I replaced the RV door with a brand new one with deadbolt and extra key.
Installed is a wireless router that can pull in other wifi signals and rebroadcast them within the RV.
I'm sure I've forgotten a lot of things about the RV as well as things that I'm throwing in because I bought them for the RV. Maybe there will be some cool surprises for you.
Registration + Smog {#registration-smog}
The RV is owned by a Wyoming Corporation and registered in South Dakota. I am actually selling you the Wyoming Corporation, not the RV. The practical implication of this is that you don't have to pay transfer tax since the owner of the RV is staying the same. South Dakota is the most common place to register RVs because you never have to get them smogged or inspected, and annual registration is low.
The Issues {#the-issues}
_Please note that most of these issues have been fixed since Tynan wrote the original post. As mentioned above, the following improvements have been made: "new tires, including the spare, getting new spark plugs, tuning it up, and changing the fluids."—Pablo_
I basically never drive the RV anymore because I have such a good parking space. I move it around a bit to dump the tanks and it always runs just fine. A year or so ago I drove across San Francisco to fill up the propane. But even though I don't know about any issues, it hasn't been seriously driven in years so I'd personally get it tuned up before any long trips.
The drawers are a little busted. I have a repair kit but haven't gotten around to installing it. The driver's side door doesn't open from the outside for some reason.
The RV is pretty dirty on the outside, but clean on the inside.
I did a lot of the work myself and while I'm competent, I'm not a finish carpenter. So maybe just decrease expectations for fit and finish by 10% and be pleasantly surprised in some cases.
{{< gallery dir="images/rialta" >}}
More info {#more-info}
- [Specs](/ox-hugo/1996_rialta_specs.pdf)
- [Brochure](/ox-hugo/1996Rialta.pdf)
- [Floor plan](/ox-hugo/floor_plan_22fd_double.gif)
Sources {#sources}
- [These young SF professionals choose to live in RVs](https://www.sfgate.com/bayarea/article/These-young-SF-professionals-choose-to-live-in-RVs-4778625.php), _SFGate_
- [The gambler, pick-up artist, blogger and Rialta](https://winnebagolife.com/2014/05/the-gambler-pick-up-artist-blogger-and-rialta), _WinnebagoLife_]]></description></item><item><title>Minimizing jet lag</title><link>https://stafforini.com/notes/minimizing-jet-lag/</link><pubDate>Sun, 25 Mar 2018 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/minimizing-jet-lag/</guid><description>&lt;![CDATA[This post lists what I believe are the most effective strategies to reduce the impact of jet lag. It evolved out of a document I wrote for a friend who sought my advice. A few of these tips are copied from Wiseman (2014); most of the other ones are based on a couple of hours of research using Google and Google Scholar.
Booking your flight {#booking-your-flight}
- _Timing._ Choose the time of the flight by following the simple adage, _Fly east, fly early. Fly west, fly late._
- _Seating._ If you need to sleep during the trip, (1) pick a window seat to avoid being disturbed by other passengers and (2) do not pick a seat on the sunny side of the plane.
- For flights in the northern hemisphere, the sun will tend to be on the left side of the plane when you fly west, and on the right side when you go east. Check [SeatGuru](https://www.seatguru.com) for more info.
- _Airline._ Ideally, book your flight with [an airline that is rarely delayed](http://www.flightstats.com/company/monthly-performance-reports/airlines/).
- The most important consideration is the frequency of _long_ delays (2+ hours) and cancellations, since these are the most disruptive. I couldn't find statistics for these outside the US, but Silver (2015) notes that overall delay times generally correlate well with both long delay and cancellation frequency, so we can rely on this measure as an adequate proxy.
- _Aircraft._ Ideally, book your flight on a Dreamliner.
Before you fly {#before-you-fly}
- Insofar as you can, gradually shift your body clock to match your destination's, by going to bed one hour earlier or later each day, and shifting your wakeup time correspondingly.
- Calculate the time you should go to bed on each of the relevant days preceding your flight. For example, if the time at your destination is five hours later, you should go to bed one hour earlier five days before your departure, two hours earlier four days before, and so on.
- For each of these days, set an alarm on your phone to ring ~2 hours prior to bedtime. When the alarm rings, start wearing [orange-tinted glasses](https://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=orange+tinted+glasses&x=0&y=0), take melatonin, and resolve to go to bed ~2 hours later. (If you have a bedtime routine, shift this routine accordingly.)
- Expose yourself to lots of light soon after waking up, by either going outdoors or using a [blue-light lamp](https://www.amazon.com/gp/bestsellers/hpc/13053141/ref=zg_b_bs_13053141_1).
- The site [Jet Lag Rooster](http://www.jetlagrooster.com/) can help implement this advice.
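For concreteness, the bedtime arithmetic above can be sketched in a few lines of Python. `bedtime_schedule` is a hypothetical helper written for illustration, not something from the original post:

```python
from datetime import datetime, timedelta

def bedtime_schedule(normal_bedtime: str, shift_hours: int) -> dict:
    """Map days-before-departure to the adjusted bedtime ("HH:MM").

    A positive shift_hours means the destination clock is ahead of yours
    (fly east: go to bed earlier); a negative value means it is behind
    (fly west: go to bed later). One hour of adjustment per day.
    """
    base = datetime.strptime(normal_bedtime, "%H:%M")
    sign = -1 if shift_hours > 0 else 1
    n = abs(shift_hours)
    return {
        days_before: (base + timedelta(hours=sign * (n - days_before + 1))).strftime("%H:%M")
        for days_before in range(n, 0, -1)
    }

# Destination five hours ahead, usual bedtime 23:00: five days out
# go to bed at 22:00, four days out at 21:00, ..., the night before at 18:00.
print(bedtime_schedule("23:00", 5))
```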
During your flight {#during-your-flight}
- As soon as you board the plane, adjust your watch to show the time at your destination, and try to fit into this new time schedule as soon as possible. If it is time to sleep, get your head down. If it is dinner time, eat something. Etc.
- Drink plenty of water, and drink often.
- When it's time to sleep, wear an [eye mask](https://www.amazon.com/OOSilk-19mm-Mulberry-Sleep-Black/dp/B010UCY3S2/ref=lp_12025177011_1_1?srs=12025177011&ie=UTF8&qid=1483885819&sr=8-1) and [earplugs](https://www.amazon.com/gp/product/B0007XJOLG), and take melatonin.
- If you still have trouble sleeping, consider taking [zolpidem](https://en.wikipedia.org/wiki/Zolpidem), [zaleplon](https://en.wikipedia.org/wiki/Zaleplon) or [temazepam](https://en.wikipedia.org/wiki/Temazepam) (the three main sleep aids used in military aviation).
After you fly {#after-you-fly}
- When you arrive at your destination, continue exposing yourself to light in the morning, and limiting exposure to (blue) light in the evening.
- Above all, _don't nap_. If you have trouble staying awake during the day, consider taking modafinil or [dextroamphetamine](https://en.wikipedia.org/wiki/Dextroamphetamine) (the two main stimulants used in military aviation).
Note that, unless you are permanently moving to a new location, you should follow some of the steps above _twice_: first when visiting your destination, and a second time when returning home.
Bibliography {#bibliography}
- {{< cite "Jedick2020StimulantsSleepAids" >}}
- {{< cite "Silver2015BetterWayTo" >}}
- {{< cite "Wiseman2014NightSchoolHidden" >}}]]></description></item><item><title>My Emacs config</title><link>https://stafforini.com/notes/my-emacs-config/</link><pubDate>Sun, 22 Feb 2026 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/my-emacs-config/</guid><description>&lt;![CDATA[I am not a programmer, let alone an Elisp hacker. My background is in the humanities. It is only a slight exaggeration to say that, before I started using Emacs in 2020, I didn't know the difference between a function and a variable. You have been forewarned.
This configuration is an Org document that tangles to an Emacs Lisp init file. It is organized into thematic sections—package management, version control, display, text manipulation, and so on—each of which groups related package declarations together. In addition to built-in features and external packages, it loads dozens of “extras” files: personal Elisp libraries stored in the `emacs/extras/` directory and named after the package they extend (e.g., `org-extras`, `elfeed-extras`, `gptel-extras`). These files are loaded via the `use-personal-package` macro, a thin wrapper around `use-package` that fetches the corresponding file from this dotfiles repository. Such a setup allows me to extend the functionality of various packages and features without cluttering the main configuration section. For example, instead of piling dozens of custom `org-mode` functions into the `org` section of this file, I place them in `emacs/extras/org-extras.el` and load that file with a single `(use-personal-package org-extras)` declaration here. This structure also allows anyone to try out my configuration selectively and straightforwardly. Thus, if you’d like to install my `org` extensions, you can just add one of the following recipes to your own config (depending on which package manager or Emacs version you use):
```emacs-lisp
;; with elpaca
(use-package org-extras
:ensure (:host github :repo "benthamite/dotfiles"
:files ("emacs/extras/org-extras.el"
"emacs/extras/doc/org-extras.texi")))
;; with straight
(use-package org-extras
:straight (:host github :repo "benthamite/dotfiles"
:files ("emacs/extras/org-extras.el"
"emacs/extras/doc/org-extras.texi")))
;; with vc (requires Emacs 30.1 or higher; no Info manual)
(use-package org-extras
:vc (:url "https://github.com/benthamite/dotfiles"
:lisp-dir "emacs/extras"
:rev :newest))
```
The extras come with their own manuals and user options: everything is documented and customizable. When installed via elpaca or straight, each package's Info manual is built and registered automatically. To read a manual, type `M-x info-display-manual RET` (or `C-h R`) and enter the package name (e.g. `org-extras`). You can also browse all available manuals with `M-x info RET` (`C-h i`). (The `:vc` recipe does not currently build Info manuals due to a limitation in `package-vc`.)
early-init {#early-init}
The contents of this code block are tangled to the `early-init.el` file.
First, I check the system appearance and blacken the screen if it's set to \`dark\`. This is done to prevent a flash of white during startup on macOS when using a dark theme. I use frame parameters to set the background and foreground colors instead of `set-face-attribute` to avoid interfering with `face-spec-recalc` during theme switches.
```emacs-lisp
(defun macos-get-system-appearance ()
"Return the current macOS system appearance."
(intern (downcase (string-trim (shell-command-to-string
"defaults read -g AppleInterfaceStyle 2>/dev/null || echo 'Light'")))))
(defun early-init-blacken-screen ()
"Blacken screen as soon as Emacs starts, if the system theme is `dark'.
Use frame parameters instead of `set-face-attribute' to avoid
interfering with `face-spec-recalc' during theme switches."
(when (eq (macos-get-system-appearance) 'dark)
(setopt mode-line-format nil)
(push '(background-color . "#000000") default-frame-alist)
(push '(foreground-color . "#ffffff") default-frame-alist)))
(early-init-blacken-screen)
```
I also disable package initialization at startup (recommended for elpaca) and set `load-prefer-newer` to `t` to ensure that Emacs always loads the latest version of a package (useful during development when packages are frequently updated).
```emacs-lisp
(setq package-enable-at-startup nil)
(setq load-prefer-newer t)
```
Then, I set some frame parameters to remove the title bar and maximize the frame on startup.
```emacs-lisp
(add-to-list 'default-frame-alist '(undecorated-round . t)) ; remove title bar
(add-to-list 'initial-frame-alist '(fullscreen . maximized)) ; maximize frame on startup
```
Finally, I redirect the native compilation cache to a directory within my Emacs profile and define a function for debugging feature loading.
```emacs-lisp
;; github.com/emacscollective/no-littering#native-compilation-cache
(when (fboundp 'startup-redirect-eln-cache)
(startup-redirect-eln-cache
(file-name-concat (getenv "HOME")
".config/emacs-profiles/var/eln-cache/")))
;; for debugging
(defun early-init-trace-feature-load (feature)
"Print a backtrace immediately after FEATURE is loaded."
(eval-after-load feature
`(message "Feature '%s' loaded by:\n%s"
',feature
(with-output-to-string
(backtrace)))))
```
package management {#package-management}
`elpaca` {#elpaca}
_[elpaca](https://github.com/progfolio/elpaca) is a package manager that supports asynchronous installation of packages._
When experiencing issues, [follow these steps](https://github.com/progfolio/elpaca/wiki/Troubleshooting).
- By default, `elpaca` makes shallow copies of all the repos it clones. You can specify the repo depth with the [:depth](https://github.com/progfolio/elpaca/blob/master/doc/manual.md#recipe-keyword-depth) keyword. What if, however, you want to turn a shallow repo into a full repo _after_ it has been cloned? There is a relatively obscure command in Magit that lets you do this: `magit-remote-unshallow`. (Note that this not only passes the `--unshallow` flag but also restores access to all branches in addition to the main one.)
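Outside Magit, roughly the same effect can be achieved with plain git. The demo below sets up a throwaway shallow clone and then unshallows it; the refspec-widening step approximates what `magit-remote-unshallow` does (restore access to all branches, then pass `--unshallow`):

```shell
#!/bin/sh
set -e
# Throwaway demo: create a repo with two commits, clone it shallowly,
# then convert the shallow clone into a full one.
tmp=$(mktemp -d)
cd "$tmp"
git init -q src
git -C src -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first"
git -C src -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "second"
# --depth only takes effect over a transport, hence the file:// URL.
git clone -q --depth 1 "file://$tmp/src" shallow
cd shallow
echo "before: $(git rev-list --count HEAD) commit(s)"   # only the tip
# Widen the fetch refspec so all branches are fetched, not just the cloned
# one, then fetch the full history.
git config remote.origin.fetch '+refs/heads/*:refs/remotes/origin/*'
git fetch -q --unshallow
echo "after: $(git rev-list --count HEAD) commit(s)"
```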
```emacs-lisp
;;; init.el --- Init File -*- lexical-binding: t -*-
(defvar elpaca-installer-version 0.12)
(defvar elpaca-directory (expand-file-name "elpaca/" user-emacs-directory))
(defvar elpaca-builds-directory (expand-file-name "builds/" elpaca-directory))
(defvar elpaca-sources-directory (expand-file-name "sources/" elpaca-directory))
;; Using benthamite fork until progfolio/elpaca#513 is merged (mono-repo deadlock fix).
(defvar elpaca-order '(elpaca :repo "https://github.com/benthamite/elpaca.git"
                              :ref "6503e6c19931dc42bf16e9af980d2e69921f7b6a" :depth 1 :inherit ignore
                              :files (:defaults "elpaca-test.el" (:exclude "extensions"))
                              :build (:not elpaca-activate)))
(let* ((repo  (expand-file-name "elpaca/" elpaca-sources-directory))
       (build (expand-file-name "elpaca/" elpaca-builds-directory))
       (order (cdr elpaca-order))
       (default-directory repo))
  (add-to-list 'load-path (if (file-exists-p build) build repo))
  (unless (file-exists-p repo)
    (make-directory repo t)
    (when (<= emacs-major-version 28) (require 'subr-x))
    (condition-case-unless-debug err
        (if-let* ((buffer (pop-to-buffer-same-window "*elpaca-bootstrap*"))
                  ((zerop (apply #'call-process
                                 `("git" nil ,buffer t "clone"
                                   ,@(when-let* ((depth (plist-get order :depth)))
                                       (list (format "--depth=%d" depth) "--no-single-branch"))
                                   ,(plist-get order :repo) ,repo))))
                  ((zerop (call-process "git" nil buffer t "checkout"
                                        (or (plist-get order :ref) "--"))))
                  (emacs (concat invocation-directory invocation-name))
                  ((zerop (call-process emacs nil buffer nil "-Q" "-L" "." "--batch"
                                        "--eval" "(byte-recompile-directory \".\" 0 'force)")))
                  ((require 'elpaca))
                  ((elpaca-generate-autoloads "elpaca" repo)))
            (progn (message "%s" (buffer-string)) (kill-buffer buffer))
          (error "%s" (with-current-buffer buffer (buffer-string))))
      ((error) (warn "%s" err) (delete-directory repo 'recursive))))
  (unless (require 'elpaca-autoloads nil t)
    (require 'elpaca)
    (elpaca-generate-autoloads "elpaca" repo)
    (let ((load-source-file-function nil)) (load "./elpaca-autoloads"))))
(add-hook 'after-init-hook #'elpaca-process-queues)
(elpaca `(,@elpaca-order))
(elpaca-wait)
;; NOTE: Do not set elpaca-queue-limit. It counts *all* queued packages (not just
;; actively cloning ones) as "active", causing a false deadlock on fresh installs
;; where hundreds of packages are enqueued simultaneously. See elpaca--continue-build.
(require 'elpaca-menu-elpa)
(setf (alist-get 'packages-url (alist-get 'gnu elpaca-menu-elpas))
      "https://raw.githubusercontent.com/emacsmirror/gnu_elpa/refs/heads/main/elpa-packages"
      (alist-get 'remote (alist-get 'gnu elpaca-menu-elpas))
      "https://github.com/emacsmirror/gnu_elpa"
      (alist-get 'packages-url (alist-get 'nongnu elpaca-menu-elpas))
      "https://raw.githubusercontent.com/emacsmirror/nongnu_elpa/refs/heads/main/elpa-packages"
      (alist-get 'remote (alist-get 'nongnu elpaca-menu-elpas))
      "https://github.com/emacsmirror/nongnu_elpa")
;; (toggle-debug-on-error) ; uncomment when debugging
(setq elpaca-lock-file (file-name-concat
                        (file-name-directory (directory-file-name elpaca-directory))
                        "lockfile.el"))
```
`use-package` {#use-package}
_[use-package](https://github.com/jwiegley/use-package) is a package organizer._
```emacs-lisp
(elpaca elpaca-use-package
  (elpaca-use-package-mode))
(use-package use-package
  :demand t
  :custom
  (use-package-always-ensure t)
  (use-package-verbose t)
  (use-package-compute-statistics t)
  (use-package-hook-name-suffix nil) ; use real name for hooks, i.e. do not omit the `-hook' bit
  (use-package-minimum-reported-time 0.1)
  :config
  (defmacro use-personal-package (name &rest args)
    "Like `use-package' but to load personal packages.
NAME and ARGS as in `use-package'."
    (declare (indent defun))
    (let ((name-str (symbol-name (eval `(quote ,name)))))
      `(use-package ,name
         :ensure (:host github :repo "benthamite/dotfiles"
                        :files ,(list (file-name-concat "emacs/extras" (file-name-with-extension name-str "el"))
                                      (file-name-concat "emacs/extras/doc" (file-name-with-extension name-str "texi")))
                        :depth nil)
         ,@args))))
(elpaca-wait)
```
Because `use-personal-package` declares each extras file with an `:ensure` recipe pointing at this GitHub repository, `elpaca` treats them like any other package: it clones the repo into its own `elpaca/repos/dotfiles/` directory and builds the relevant files from there. This means the dotfiles end up in two separate local clones—the primary one in my main dotfiles location (where all edits are made) and the elpaca-managed one under `elpaca/repos/dotfiles/` (which Emacs loads from). To keep them in sync, three git hooks in the primary clone's gitdir automatically propagate changes to the elpaca clone. The `post-commit` hook handles normal commits, the `post-rewrite` hook handles rebases and amends, and both delegate to a shared `sync-elpaca-clone.sh` script. All three files live in the gitdir's `hooks/` directory and must be executable. The script reads the active profile name from a cache file (`~/.config/emacs-profiles/.current-profile`) that Emacs writes at startup, rather than calling `emacsclient`, to avoid deadlocking when the hook is triggered by a synchronous git process inside Emacs (e.g., `magit-commit-squash`).
`hooks/sync-elpaca-clone.sh`:
```shell
#!/bin/sh
# Propagate the current state of master to the elpaca dotfiles clone.
# Reads the active profile from a cache file written by Emacs at startup,
# avoiding emacsclient calls that deadlock when git is spawned synchronously
# by Emacs. Called by post-commit and post-rewrite hooks.
PROFILE=$(cat "$HOME/.config/emacs-profiles/.current-profile" 2>/dev/null)
ELPACA_BASE="$HOME/.config/emacs-profiles/$PROFILE/elpaca"
# Newer elpaca versions use sources/ instead of repos/
if [ -d "$ELPACA_BASE/sources/dotfiles/.git" ]; then
ELPACA_DOTFILES="$ELPACA_BASE/sources/dotfiles"
elif [ -d "$ELPACA_BASE/repos/dotfiles/.git" ]; then
ELPACA_DOTFILES="$ELPACA_BASE/repos/dotfiles"
fi
if [ -n "$ELPACA_DOTFILES" ]; then
GITDIR="$GIT_DIR"
# The parent git process sets GIT_DIR, GIT_INDEX_FILE, and
# GIT_WORK_TREE pointing at the Google Drive clone. Unsetting them
# is essential so that git commands target the elpaca clone.
unset GIT_DIR GIT_WORK_TREE GIT_INDEX_FILE
git -C "$ELPACA_DOTFILES" fetch "$GITDIR" master 2>&1 &&
git -C "$ELPACA_DOTFILES" reset --hard FETCH_HEAD 2>&1
fi
```
`hooks/post-commit`:
```shell
#!/bin/sh
exec "$(dirname "$0")/sync-elpaca-clone.sh"
```
`hooks/post-rewrite`:
```shell
#!/bin/sh
exec "$(dirname "$0")/sync-elpaca-clone.sh"
```
`use-package-extras` {#use-package-extras}
_[use-package-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/use-package-extras.el) collects my extensions for `use-package`._
```emacs-lisp
(use-personal-package use-package-extras
:demand t
:hook
(init-post-init-hook . use-package-extras-display-startup-time))
```
`elpaca-extras` {#elpaca-extras}
_[elpaca-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/elpaca-extras.el) collects my extensions for `elpaca`._
```emacs-lisp
(use-personal-package elpaca-extras
:ensure (:wait t)
:after use-package-extras
:custom
(elpaca-extras-write-lock-file-excluded '(tlon)))
```
foundational {#foundational}
`gcmh` {#gcmh}
_[GCMH](https://github.com/emacsmirror/gcmh) enforces a sneaky Garbage Collection strategy to minimize GC interference with user activity._
```emacs-lisp
(use-package gcmh
:config
(gcmh-mode))
```
`seq` {#seq}
_[seq](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/seq.el) provides sequence-manipulation functions that complement basic functions provided by `subr.el`._
```emacs-lisp
;; https://github.com/progfolio/elpaca/issues/216#issuecomment-1868747372
(defun elpaca-unload-seq (e)
(and (featurep 'seq) (unload-feature 'seq t))
(elpaca--continue-build e))
(use-package seq
:ensure `(seq :build (:before elpaca-activate elpaca-unload-seq)))
```
`paths` {#paths}
_[paths](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/paths.el) defines various paths used in this configuration._
```emacs-lisp
(use-personal-package paths)
```
`transient` {#transient}
_transient is a library for creating keyboard-driven menus._
```emacs-lisp
(use-package transient
:ensure (:host github
:repo "magit/transient"
:branch "main" ; github.com/progfolio/elpaca/issues/342
:build (:not elpaca-check-version))
:after seq
:custom
(transient-default-level 7) ; magit.vc/manual/transient/Enabling-and-Disabling-Suffixes.html
(transient-save-history nil) ; the history file was throwing an error on startup
:bind
(:map transient-base-map
("M-q" . transient-quit-one)))
```
`init` {#init}
_[init](https://github.com/benthamite/init) is a private package that I use to manage my config files and profiles._
```emacs-lisp
(use-package init
:ensure (:host github
:repo "benthamite/init"
:depth nil ; clone entire repo, not just last commit
:wait t)
:after paths
:demand t
:config
(init-startup)
;; Cache the profile name to a file so that git hooks can read it
;; without calling emacsclient (which deadlocks when the hook is
;; spawned by a synchronous git process inside Emacs).
(with-temp-file (file-name-concat
(file-name-directory
(directory-file-name user-emacs-directory))
".current-profile")
(insert init-current-profile))
:bind
("A-n" . init-menu))
```
`no-littering` {#no-littering}
_[no-littering](https://github.com/emacscollective/no-littering) keeps `.emacs.d` clean._
```emacs-lisp
(use-package no-littering
:ensure (:wait t)
:demand t
:init
;; these directories should be shared across profiles, so there should
;; be only one `var' and one `etc' directory in `emacs-profiles'
;; rather than a pair of such directories for each profile
(setq no-littering-etc-directory (file-name-concat paths-dir-emacs-profiles "etc/"))
(setq no-littering-var-directory (file-name-concat paths-dir-emacs-profiles "var/"))
:config
;; github.com/emacscollective/no-littering#auto-save-settings
;; should not be set via :custom
(setq auto-save-file-name-transforms
`((".*" ,(no-littering-expand-var-file-name "auto-save/") t))))
```
`ns-win` {#ns-win}
_ns-win provides various Nextstep convenience functions._
```emacs-lisp
(use-feature ns-win
:custom
(mac-option-modifier 'meta)
(mac-control-modifier 'control)
(mac-command-modifier 'hyper)
(mac-function-modifier 'none)
(mac-right-option-modifier 'none)
(mac-right-control-modifier 'super)
(mac-right-command-modifier 'alt)
;; ns-use-proxy-icon set to t causes Emacs to freeze
(ns-use-proxy-icon nil))
```
`iso-transl` {#iso-transl}
_iso-transl defines ways of entering the non-ASCII printable characters with codes above 127._
```emacs-lisp
(use-feature iso-transl
:config
(setq iso-transl-char-map nil) ; emacs.stackexchange.com/questions/17508/
;; unset all `Super' key bindings
(dolist (char (number-sequence ?a ?z))
(keymap-global-unset (concat "s-" (char-to-string char))))
;; unset some `Alt' key bindings in `key-translation-map'
(dolist (char '("SPC" "!" "$" "+" "-" "<" "=" ">" "?" "a" "c" "m" "o" "u"
"x" "C" "L" "P" "R" "S" "T" "Y" "[" "]" "{" "|" "}"))
(keymap-unset key-translation-map (concat "A-" char))))
```
`el-patch` {#el-patch}
_[el-patch](https://github.com/raxod502/el-patch) customizes the behavior of Emacs Lisp functions and notifies the user when a function so customized changes._
```emacs-lisp
(use-package el-patch)
```
`casual` {#casual}
_[casual](https://github.com/kickingvegas/casual) is a collection of Transient menus for various Emacs modes._
```emacs-lisp
(use-package casual
:defer t
:init
(with-eval-after-load 'calc-mode
(bind-keys :map calc-mode-map
("C-o" . casual-calc-tmenu)
:map calc-alg-map
("C-o" . casual-calc-tmenu))))
```
`warnings` {#warnings}
_warnings provides support for logging and displaying warnings._
```emacs-lisp
(use-feature warnings
:custom
(warning-suppress-types '((copilot copilot-exceeds-max-char)
(flycheck syntax-checker)
(org-roam)
(tramp)
(aidermacs)
(org-element-cache)
(yasnippet backquote-change))))
```
`comp` {#comp}
_comp compiles Lisp code into native code._
```emacs-lisp
(use-feature comp
:custom
(native-comp-async-report-warnings-errors nil))
```
`bytecomp` {#bytecomp}
_bytecomp compiles Lisp code into byte code._
```emacs-lisp
(use-feature bytecomp
:custom
(byte-compile-warnings '(cl-functions)))
```
`startup` {#startup}
_[startup](https://github.com/emacs-mirror/emacs/blob/master/lisp/startup.el) processes Emacs shell arguments and controls startup behavior._
```emacs-lisp
(use-feature emacs
:custom
(user-full-name "Pablo Stafforini")
(user-mail-address (getenv "PERSONAL_GMAIL"))
(initial-scratch-message nil)
(inhibit-startup-screen t)
(inhibit-startup-echo-area-message user-login-name)
(inhibit-startup-buffer-menu t)
(frame-resize-pixelwise t))
```
`server` {#server}
_server starts a server for external clients to connect to._
```emacs-lisp
(use-feature server
:demand t
:config
(unless (server-running-p)
(server-start)))
```
`async` {#async}
_[async](https://github.com/jwiegley/emacs-async) is a simple library for asynchronous processing in Emacs._
```emacs-lisp
(use-package async
:defer t)
```
`prot-common` {#prot-common}
_[prot-common](https://github.com/protesilaos/dotfiles/blob/master/emacs/.emacs.d/prot-lisp/prot-common.el) is a set of functions used by Protesilaos Stavrou's unreleased "packages"._
Note Prot's clarification:
> Remember that every piece of Elisp that I write is for my own educational and recreational purposes. I am not a programmer and I do not recommend that you copy any of this if you are not certain of what it does.
```emacs-lisp
(use-package prot-common
:ensure (:host github
:repo "protesilaos/dotfiles"
:local-repo "prot-common"
:main "emacs/.emacs.d/prot-lisp/prot-common.el"
:build (:not elpaca-check-version)
:files ("emacs/.emacs.d/prot-lisp/prot-common.el")))
```
`prot-simple` {#prot-simple}
_[prot-simple](https://github.com/protesilaos/dotfiles/blob/master/emacs/.emacs.d/prot-lisp/prot-simple.el) is a set of common commands used by Protesilaos Stavrou's unreleased "packages"._
Note Prot's clarification:
> Remember that every piece of Elisp that I write is for my own educational and recreational purposes. I am not a programmer and I do not recommend that you copy any of this if you are not certain of what it does.
```emacs-lisp
(use-package prot-simple
:ensure (:host github
:repo "protesilaos/dotfiles"
:local-repo "prot-simple"
:main "emacs/.emacs.d/prot-lisp/prot-simple.el"
:build (:not elpaca-check-version)
:files ("emacs/.emacs.d/prot-lisp/prot-simple.el"))
:after prot-common
:custom
(prot-simple-date-specifier "%F")
(prot-simple-time-specifier "%R %z")
:bind
(("M-s-=" . prot-simple-insert-date)
("A-C-H-j" . prot-simple-mark-sexp)))
```
`bug-hunter` {#bug-hunter}
_[bug-hunter](https://elpa.gnu.org/packages/bug-hunter.html) interactively bisects and debugs your init file._
```emacs-lisp
(use-package bug-hunter
:defer t)
```
`inheritenv` {#inheritenv}
_[inheritenv](https://github.com/purcell/inheritenv) allows temp buffers to inherit buffer-local environment variables._
```emacs-lisp
(use-package inheritenv
:ensure (:host github :repo "purcell/inheritenv")
:defer t)
```
`misc` {#misc}
_Miscellaneous settings: default directory, short answers, message log, bell, cursor width, and UTF-8 encoding._
```emacs-lisp
(use-feature emacs
:custom
(default-directory paths-dir-dropbox)
(use-short-answers t)
(message-log-max t)
(ring-bell-function 'ignore) ; silence bell when mistake is made
(x-stretch-cursor t) ; make cursor the width of the character under it
;; emacs.stackexchange.com/questions/14509/kill-process-buffer-without-confirmation
;; UTF8 stuff.
:init
(prefer-coding-system 'utf-8)
(set-default-coding-systems 'utf-8)
(set-terminal-coding-system 'utf-8)
(set-keyboard-coding-system 'utf-8)
:bind
(:map input-decode-map
("M-8" . "•")))
```
performance {#performance}
`profiler` {#profiler}
_[profiler](https://github.com/emacs-mirror/emacs/blob/master/lisp/profiler.el) provides UI and helper functions for the Emacs profiler._
```emacs-lisp
(use-feature profiler
:defer t)
```
`profiler-extras` {#profiler-extras}
_[profiler-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/profiler-extras.el) collects my extensions for `profiler`._
```emacs-lisp
(use-personal-package profiler-extras
:bind
(("A-H-p" . profiler-extras-profiler-toggle)
:map profiler-report-mode-map
("<backtab>" . profiler-extras-profiler-report-toggle-entry-global)))
```
`so-long` {#so-long}
_[so-long](https://savannah.nongnu.org/projects/so-long) optimizes performance with minified code._
```emacs-lisp
(use-feature so-long
:custom
(so-long-threshold 500000)
:hook
(find-file-hook . global-so-long-mode))
```
`misc.` {#misc-dot}
_Performance-related settings: bidirectional display, font caches, fontification skipping, and process output buffer size._
Partly borrowed from [Prot](https://gitlab.com/protesilaos/dotfiles/-/blob/350ca3144c5ee868056619b9d6351fca0d6b131e/emacs/.emacs.d/emacs-init.org).
```emacs-lisp
(use-feature emacs
:custom
(bidi-display-reordering nil)
(bidi-inhibit-bpa t)
(inhibit-compacting-font-caches t)
(redisplay-skip-fontification-on-input t)
;; emacs-lsp.github.io/lsp-mode/page/performance/
(read-process-output-max (expt 1024 2))
(bidi-paragraph-direction 'left-to-right))
```
secrets {#secrets}
`plstore` {#plstore}
_plstore is a plist-based data store providing search and partial encryption._
This feature is required by `org-gcal`. We create a new GPG key to use with `org-gcal` and add its public ID to `plstore-encrypt-to`, following [these instructions](https://github.com/kidd/org-gcal.el#note). (This method is superior to using symmetric encryption because it does not prompt the user for authentication with every new Emacs session.)
```emacs-lisp
(use-feature plstore
:after pass
:config
(add-to-list 'plstore-encrypt-to "A7C6A908CD1254A8B4051D3DCDBBB523C9627A26"))
```
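As an illustration of the partial encryption plstore provides, here is a minimal sketch (with a hypothetical file path and entry) that stores one plaintext property and one property encrypted to the recipient in `plstore-encrypt-to`:

```emacs-lisp
(require 'plstore)

;; Open (or create) a store and add an entry whose :token is encrypted
;; while :user remains readable in plain text.
(let ((store (plstore-open "~/.emacs.d/demo.plist"))) ; hypothetical path
  (plstore-put store "example.com"
               '(:user "alice")     ; stored unencrypted
               '(:token "s3cret"))  ; encrypted to `plstore-encrypt-to'
  (plstore-save store)
  (plstore-close store))
```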
`epg-config` {#epg-config}
_epg-config provides configuration for the Easy Privacy Guard library._
```emacs-lisp
(use-feature epg-config
:custom
(epg-pinentry-mode 'loopback) ; use minibuffer for password entry
(epg-gpg-program "/opt/homebrew/bin/gpg"))
```
`auth-source` {#auth-source}
_auth-source supports authentication sources for Gnus and Emacs._
```emacs-lisp
(use-feature auth-source
:preface
(eval-when-compile
(defvar auth-sources))
:custom
(auth-source-debug nil) ; set to t for debugging
(auth-source-do-cache t)
(auth-sources '(macos-keychain-internet macos-keychain-generic)))
```
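To see what this buys us, here is a hedged sketch of how a package would query credentials through `auth-source`; with the settings above, the lookup consults the macOS keychain backends (the host name is hypothetical):

```emacs-lisp
(require 'auth-source)

;; Look up the first matching credential for a hypothetical host.
(when-let* ((match (car (auth-source-search :host "example.com" :max 1)))
            (secret (plist-get match :secret)))
  ;; Secrets may be returned as a closure to avoid keeping the plain
  ;; text in memory; call it to obtain the actual string.
  (if (functionp secret) (funcall secret) secret))
```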
`oauth2-auto` {#oauth2-auto}
_[emacs-oauth2-auto](https://github.com/telotortium/emacs-oauth2-auto) supports authentication to an OAuth2 provider from within Emacs._
```emacs-lisp
(use-package oauth2-auto
:ensure (:host github
:repo "telotortium/emacs-oauth2-auto"
:protocol ssh)
:after org-gcal
:custom
(oauth2-auto-plstore (no-littering-expand-var-file-name "oauth2-auto.plist")))
```
`pass` {#pass}
_[pass](https://github.com/NicolasPetton/pass) is a major mode for [pass](https://en.wikipedia.org/wiki/Pass_(software)), the standard Unix password manager._
```emacs-lisp
(use-package pass
:custom
(pass-suppress-confirmations t)
(pass-show-keybindings nil)
:config
(run-with-timer (* 5 60) t (lambda () (magit-extras-warn-if-repo-is-dirty paths-dir-dropbox-tlon-pass)))
:bind
(("A-H-o" . pass)
:map pass-mode-map
("RET" . pass-edit)
("c" . pass-copy)
("D" . pass-kill)
:map pass-view-mode-map
("s-p" . pass-view-toggle-password)
("H-q" . pass-quit)
("s-s" . server-edit)))
```
`pass-extras` {#pass-extras}
_[pass-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/pass-extras.el) collects my extensions for `pass`._
```emacs-lisp
(use-personal-package pass-extras
:bind
(:map pass-mode-map
("SPC" . pass-extras-open-at-point)
("e" . pass-extras-edit)
("G" . pass-extras-generate-password)
("I" . pass-extras-insert-generated-no-symbols)))
```
`password-store-otp` {#password-store-otp}
_[password-store-otp](https://github.com/volrath/password-store-otp.el) provides integration with the pass-otp extension for pass._
```emacs-lisp
(use-package password-store-otp
:ensure (:version (lambda (_) "0.1.5")) ; github.com/progfolio/elpaca/issues/229
:after pass)
```
`auth-source-pass` {#auth-source-pass}
_auth-source-pass integrates auth-source with password-store._
```emacs-lisp
(use-feature auth-source-pass
:demand t
:after auth-source pass
:config
(auth-source-pass-enable)
:hook
(doom-modeline-before-github-fetch-notification-hook . auth-source-pass-enable))
```
`password-generator` {#password-generator}
_[password-generator](https://github.com/vandrlexay/emacs-password-genarator) [sic] generates various types of passwords._
```emacs-lisp
(use-package password-generator
:ensure (:host github
:repo "vandrlexay/emacs-password-genarator") ; sic
:defer t)
```
version control {#version-control}
`vc` {#vc}
_vc provides support for various version control systems._
```emacs-lisp
(use-feature vc
:defer t
:custom
(vc-handled-backends '(Git))
(vc-follow-symlinks t) ; don't ask for confirmation when opening symlinked file
(vc-make-backup-files nil) ; do not backup version controlled files
;; Disable VC in Dropbox cloud storage directories. `vc-git' runs
;; synchronous `git' subprocesses via `call-process', which can hang
;; indefinitely when Dropbox's virtual filesystem stalls on I/O
;; (e.g. smart sync resolving a cloud-only file). This blocks the
;; main thread since there is no timeout on the read.
(vc-ignore-dir-regexp
(format "%s\\|%s\\|%s"
vc-ignore-dir-regexp
"Library/CloudStorage/Dropbox/"
"My Drive/")))
```
`vc-extras` {#vc-extras}
_[vc-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/vc-extras.el) collects my extensions for `vc`._
```emacs-lisp
(use-personal-package vc-extras
:after vc
:custom
(vc-extras-split-repo t)
:bind
("A-v" . vc-extras-menu))
```
`log-edit` {#log-edit}
_log-edit is a major mode for editing version-control commit messages._
```emacs-lisp
(use-feature log-edit
:defer t
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'log-edit-comment-ring)))
```
`diff-mode` {#diff-mode}
_diff-mode is a mode for viewing and editing context diffs._
```emacs-lisp
(use-feature diff-mode
:defer t
:bind
(:map diff-mode-map
("M-o" . nil)))
```
`ediff` {#ediff}
_[ediff](https://github.com/emacs-mirror/emacs/blob/master/lisp/vc/ediff.el) is a comprehensive visual interface to diff and patch._
```emacs-lisp
(use-feature ediff
:custom
(ediff-window-setup-function 'ediff-setup-windows-plain)
(ediff-split-window-function 'split-window-horizontally)
:config
(defun ediff-toggle-word-mode ()
"Toggle between linewise and wordwise comparisons."
(interactive)
(setq ediff-word-mode (not ediff-word-mode))
(message "Word mode %s"
(if ediff-word-mode "enabled" "disabled"))
(ediff-update-diffs))
:bind
(("A-d" . ediff)))
```
`ediff-extras` {#ediff-extras}
_[ediff-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/ediff-extras.el) collects my extensions for `ediff`._
```emacs-lisp
(use-personal-package ediff-extras
:after ediff
:demand t)
```
`smerge` {#smerge}
_[smerge-mode](https://github.com/emacs-mirror/emacs/blob/master/lisp/vc/smerge-mode.el) is a minor mode for resolving diff3 conflicts._
```emacs-lisp
(use-feature smerge-mode
:bind
(:map smerge-mode-map
("s-n" . smerge-next)
("s-SPC" . smerge-next)
("s-p" . smerge-prev)
("s-l" . smerge-keep-lower)
("s-k" . smerge-keep-upper)
("s-a" . smerge-keep-all)
("s-b" . smerge-keep-base)
("s-c" . smerge-keep-current)))
```
`gh` {#gh}
_[gh](https://github.com/sigma/gh.el) is a GitHub API library for Emacs._
```emacs-lisp
(use-package gh
:ensure (:version (lambda (_) "2.29"))
:defer t) ; github.com/progfolio/elpaca/issues/229
```
`closql` {#closql}
_[closql](https://github.com/magit/closql) stores EIEIO objects using EmacSQL._
```emacs-lisp
(use-package closql
:ensure (:host github
:repo "magit/closql")
:defer t)
```
`magit` {#magit}
_[magit](https://github.com/magit/magit) is a complete text-based user interface to Git._
```emacs-lisp
(use-package magit
:ensure (:host github
:repo "magit/magit"
:branch "main"
:build (:not elpaca-check-version))
:custom
(magit-commit-ask-to-stage 'stage)
(magit-clone-set-remote.pushDefault t)
(magit-diff-refine-hunk 'all) ; show word-granularity differences in all diff hunks
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'magit-read-rev-history))
(add-to-list 'magit-no-confirm 'stage-all-changes)
:hook
((magit-status-mode-hook magit-diff-mode-hook) .
(lambda ()
"Disable line truncation in Magit buffers."
(setq truncate-lines nil)))
:bind
(("A-g" . magit)
("A-M-g" . magit-clone)
:map magit-log-mode-map
("k" . magit-section-backward-sibling)
("l" . magit-section-forward-sibling)
:map magit-mode-map
("p" . magit-pull)
("." . magit-push)
:map magit-diff-mode-map
("A-C-s-r" . magit-section-backward-sibling)
("A-C-s-f" . magit-section-forward-sibling)
:map magit-hunk-section-map
("s-l" . magit-smerge-keep-lower)
("s-k" . magit-smerge-keep-upper)
("s-a" . magit-smerge-keep-all)
("s-b" . magit-smerge-keep-base)
("s-c" . magit-smerge-keep-current)
:map magit-hunk-section-smerge-map
("s-l" . magit-smerge-keep-lower)
("s-k" . magit-smerge-keep-upper)
("s-a" . magit-smerge-keep-all)
("s-b" . magit-smerge-keep-base)
("s-c" . magit-smerge-keep-current)
:map magit-status-mode-map
("s-l" . magit-smerge-keep-lower)
("s-k" . magit-smerge-keep-upper)
("s-a" . magit-smerge-keep-all)
("s-b" . magit-smerge-keep-base)
("s-c" . magit-smerge-keep-current)
("s-r" . tlon-commit-when-slug-at-point)
("s-u" . magit-remote-unshallow)
("A-C-s-r" . magit-section-backward-sibling)
("A-C-s-f" . magit-section-forward-sibling)
:map magit-revision-mode-map
("A-C-s-r" . magit-section-backward-sibling)
("A-C-s-f" . magit-section-forward-sibling)))
```
- [EMACSPEAK The Complete Audio Desktop: GitHub Standard Fork And Pull-Request Workflow From Emacs](https://emacspeak.blogspot.com/2020/05/github-standard-fork-and-pull-request.html)
- To read: [Super Keybindings for Magit | Emacs Redux](https://emacsredux.com/blog/2020/12/11/super-keybindings-for-magit/)
`magit-extra` {#magit-extra}
_[magit-extra](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/magit-extra.el) collects my extensions for `magit`._
Note that this is called `magit-extra` (with no ‘s’ at the end) because Magit already provides a feature called `magit-extras`.
```emacs-lisp
(use-personal-package magit-extra
:after magit
:demand t
:hook
(git-commit-setup-hook . magit-extras-move-point-to-start)
:bind
("s-p" . magit-extras-with-editor-finish-and-push))
```
`magit-todos` {#magit-todos}
_[magit-todos](https://github.com/alphapapa/magit-todos) displays TODOs present in project files in the Magit status buffer._
```emacs-lisp
(use-package magit-todos
:ensure (:host github
:repo "alphapapa/magit-todos"
:build (:not elpaca-check-version))
:after magit hl-todo
:custom
(magit-todos-branch-list nil)
:config
(magit-todos-mode))
```
`with-editor` {#with-editor}
_[with-editor](https://github.com/magit/with-editor) allows the use of Emacsclient as the $EDITOR for external programs._
```emacs-lisp
(use-package with-editor
:bind (("s-c" . with-editor-finish)
("s-k" . with-editor-cancel)
("C-c C-c" . with-editor-finish)))
```
`ghub` {#ghub}
_[ghub](https://github.com/magit/ghub) provides basic support for using the APIs of various Git forges from Emacs packages._
```emacs-lisp
(use-package ghub
:ensure (:host github
:build (:not elpaca-check-version)
:repo "magit/ghub"
:branch "main")
:defer t
:config
(require 'pass))
```
`forge` {#forge}
_[forge](https://github.com/magit/forge) lets one work with Git forges directly from Magit._
```emacs-lisp
(use-package forge
:ensure (:host github
:repo "magit/forge"
:branch "main" ; github.com/progfolio/elpaca/issues/342
:build (:not elpaca-check-version))
:after magit ghub emacsql auth-source-pass
:init
(with-eval-after-load 'magit-status
(bind-keys :map 'magit-status-mode-map
("s-a" . forge-topic-set-assignees)
("s-d" . forge-delete-comment)
("s-e" . forge-edit-post)
("s-i" . forge-browse-issue)
("s-I" . forge-browse-issues)
("s-l" . forge-topic-set-labels)
("s-o" . forge-topic-status-set-done)
("s-p" . forge-create-post)
("s-r" . forge-create-post)
("s-t" . forge-topic-set-title)))
(with-eval-after-load 'magit
(bind-keys :map 'magit-mode-map
("n" . forge-dispatch)))
:custom
(forge-owned-accounts '(("benthamite")))
(forge-topic-list-limit '(500 . -500)) ; show closed topics only via `forge-toggle-closed-visibility'
;; do not show inactive topics by default; keep other settings unchanged
(forge-status-buffer-default-topic-filters
(forge--topics-spec :type 'topic :active nil :state 'open :order 'newest))
:config
;; why is this turned on by default!?
(remove-hook 'forge-post-mode-hook 'turn-on-flyspell)
;; temporarily override this function until the hard error on unknown notification types is removed upstream
(defun forge--ghub-massage-notification (data githost)
(let-alist data
(let* ((type (intern (downcase .subject.type)))
(type (if (eq type 'pullrequest) 'pullreq type))
(_ (unless (memq type '( discussion issue pullreq
commit release checksuite)) ; Added checksuite
(message "Forge: Ignoring unknown notification type: %s" type))) ; Changed error to message
(number-or-commit (and .subject.url
(string-match "[^/]*\\'" .subject.url)
(match-string 0 .subject.url)))
(number (and (memq type '(discussion issue pullreq))
(string-to-number number-or-commit)))
(repo (forge-get-repository
(list githost
.repository.owner.login
.repository.name)
nil :insert!))
(repoid (oref repo id))
(owner (oref repo owner))
(name (oref repo name))
(id (forge--object-id repoid (string-to-number .id)))
(alias (intern (concat "_" (string-replace "=" "_" id)))))
(and number
(list alias id
`((,alias repository)
[(name ,name)
(owner ,owner)]
,@(cddr
(caddr
(ghub--graphql-prepare-query
ghub-fetch-repository
(pcase type
('discussion `(repository
discussions
(discussion . ,number)))
('issue `(repository
issues
(issue . ,number)))
('pullreq `(repository
pullRequest
(pullRequest . ,number))))))))
repo type data)))))
:hook
(forge-issue-mode-hook . simple-extras-visual-line-mode-enhanced)
:bind
(:map forge-post-mode-map
("s-c" . forge-post-submit)
:map forge-issue-mode-map
("s-a" . forge-topic-set-assignees)
("s-d" . forge-delete-comment)
("s-e" . forge-edit-post)
("s-i" . forge-browse-issue)
("s-I" . forge-browse-issues)
("s-l" . forge-topic-set-labels)
("s-o" . forge-topic-status-set-done)
("s-p" . forge-create-post)
("s-r" . forge-create-post)
("s-t" . forge-topic-set-title)
:map forge-notifications-mode-map
("s-a" . forge-topic-set-assignees)
("s-d" . forge-delete-comment)
("s-e" . forge-edit-post)
("s-i" . forge-browse-issue)
("s-I" . forge-browse-issues)
("s-l" . forge-topic-set-labels)
("s-o" . forge-topic-status-set-done)
("s-p" . forge-create-post)
("s-r" . forge-create-post)
("s-t" . forge-topic-set-title)
:map forge-topic-mode-map
("s-a" . forge-topic-set-assignees)
("s-d" . forge-delete-comment)
("s-e" . forge-edit-post)
("s-i" . forge-browse-issue)
("s-I" . forge-browse-issues)
("s-l" . forge-topic-set-labels)
("s-o" . forge-topic-status-set-done)
("s-p" . forge-create-post)
("s-r" . forge-create-post)
("s-t" . forge-topic-set-title)))
```
`orgit` {#orgit}
_[orgit](https://github.com/magit/orgit) provides support for Org links to Magit buffers._
```emacs-lisp
(use-package orgit
:ensure (:build (:not elpaca-check-version))
:defer t)
```
`orgit-forge` {#orgit-forge}
_[orgit-forge](https://github.com/magit/orgit-forge) supports `org-mode` links to `forge` buffers._
```emacs-lisp
(use-package orgit-forge
:after org forge
:ensure (:build (:not elpaca-check-version)))
```
`forge-search` {#forge-search}
_[forge-search](https://github.com/eatse21/forge-search.el/blob/master/forge-search.el) supports searching through issues and pull requests within `forge`._
```emacs-lisp
(use-package forge-search
:ensure (:host github
:repo "benthamite/forge-search.el"
:branch "fix/forge-get-repository")
:after forge
:bind
(:map forge-search-mode-map
("A-C-s-r" . magit-section-backward-sibling)
("A-C-s-f" . magit-section-forward-sibling)))
```
`forge-extras` {#forge-extras}
_[forge-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/forge-extras.el) collects my extensions for `forge`._
```emacs-lisp
(use-personal-package forge-extras
:after forge
:demand t
:init
(with-eval-after-load 'magit-status
(bind-keys :map 'magit-status-mode-map
("s-x" . forge-extras-state-set-dwim)))
:custom
(forge-extras-project-owner "tlon-team")
(forge-extras-project-number 9)
(forge-extras-project-node-id "PVT_kwDOBtGWf84A5jZf")
(forge-extras-status-field-node-id "PVTSSF_lADOBtGWf84A5jZfzguVNY8")
(forge-extras-estimate-field-node-id "PVTF_lADOBtGWf84A5jZfzguVNc0")
(forge-extras-status-option-ids-alist
'(("Doing" . "47fc9ee4")
("Next" . "8607328f")
("Later" . "13e22f63")
("Someday" . "4bf0f00e")
("Waiting" . "28097d1b")
("Done" . "98236657")))
:config
(advice-add 'orgit-store-link :override #'forge-extras-orgit-store-link)
(advice-add 'forge-visit-this-topic :before #'forge-extras-sync-read-status)
(run-with-idle-timer 30 t #'forge-extras-pull-notifications)
:bind
(:map forge-issue-mode-map
("A-C-s-d" . forge-previous-message)
("A-C-s-f" . forge-next-message)
("s-s" . forge-extras-set-project-status)
("s-w" . forge-extras-copy-message-at-point-as-kill))
(:map forge-notifications-mode-map
("x" . forge-extras-browse-github-inbox)
("s-x" . forge-extras-state-set-dwim))
(:map forge-topic-mode-map
("s-x" . forge-extras-state-set-dwim)))
```
`emacs-pr-review` {#emacs-pr-review}
_[emacs-pr-review](https://github.com/blahgeek/emacs-pr-review) provides support for reviewing pull requests in Emacs._
See [this config](https://gitlab.com/magus/mes/-/blob/8615353ec007bd66209ee1ae3badddd26d3a3dc9/lisp/mes-dev-basics.el#L76) for ideas.
```emacs-lisp
(use-package pr-review
:after forge)
```
`git-auto-commit-mode` {#git-auto-commit-mode}
_[git-auto-commit-mode](https://github.com/ryuslash/git-auto-commit-mode) allows for committing and pushing automatically after each save._
```emacs-lisp
(use-package git-auto-commit-mode
:after recentf
:config
(setq-default gac-automatically-push-p nil)
(setq-default gac-debounce-interval 30)
(setq-default gac-silent-message-p t)
(setq-default gac-automatically-add-new-files-p t))
```
display {#display}
```emacs-lisp
(setq-default line-spacing 2)
```
`fringe` {#fringe}
_[fringe](https://github.com/emacs-mirror/emacs/blob/master/lisp/fringe.el) controls the thin strips at the edges of windows used for indicators._
```emacs-lisp
(use-feature fringe
:config
(setq-default fringe-indicator-alist
'((truncation nil nil)
(continuation nil nil)
(overlay-arrow . right-triangle)
(up . up-arrow)
(down . down-arrow)
(top top-left-angle top-right-angle)
(bottom bottom-left-angle bottom-right-angle top-right-angle top-left-angle)
(top-bottom left-bracket right-bracket top-right-angle top-left-angle)
(empty-line . empty-line)
(unknown . question-mark))))
```
`faces` {#faces}
_[faces](https://github.com/emacs-mirror/emacs/blob/master/lisp/faces.el) provides face definition and manipulation._
```emacs-lisp
(use-feature faces
:config
(setq ns-use-thin-smoothing t)
;; to prevent misalignment in vtable
(set-face-attribute 'header-line nil :box nil))
```
- [An Annotated Spacemacs - For an org-mode workflow](https://out-of-cheese-error.netlify.app/spacemacs-config)
- [zzamboni.org | Beautifying Org Mode in Emacs](https://zzamboni.org/post/beautifying-org-mode-in-emacs/)
`faces-extras` {#faces-extras}
_[faces-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/faces-extras.el) collects my extensions for `faces`._
```emacs-lisp
(use-personal-package faces-extras
:demand t
:config
(faces-extras-set-and-store-face-attributes
'((default :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)
(fixed-pitch :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-height)
(variable-pitch :family faces-extras-variable-pitch-font :height faces-extras-variable-pitch-height)
(window-divider :foreground (face-attribute 'mode-line-inactive :background))))
:hook
(init-post-init-hook . faces-extras-set-custom-face-attributes)
:bind
("C-h C-f" . faces-extras-describe-face))
```
`org-modern` {#org-modern}
_[org-modern](https://github.com/minad/org-modern) prettifies org mode._
```emacs-lisp
(use-package org-modern
:after org faces-extras
:custom
(org-modern-table nil) ; doesn’t work well with variable-pitch: github.com/minad/org-modern/issues/99
(org-modern-statistics nil)
(org-modern-star 'fold)
(org-modern-fold-stars
'(("▸" . "▾")
("▸" . "▾")
("▸" . "▾")
("▸" . "▾")
("▸" . "▾")))
(org-modern-replace-stars '("◉" "◉" "◉" "◉" "◉"))
(org-modern-list
'((42 . "○")
(43 . "○")
(45 . "○")))
:config
(faces-extras-set-and-store-face-attributes
'((org-modern-date-active :family faces-extras-fixed-pitch-font :height faces-extras-org-date-height)
(org-modern-date-inactive :family faces-extras-fixed-pitch-font :height faces-extras-org-date-height)
(org-modern-tag :family faces-extras-fixed-pitch-font :height faces-extras-org-tag-height)
(org-modern-label :family faces-extras-fixed-pitch-font :height faces-extras-org-tag-height)))
(global-org-modern-mode))
```
`org-modern-indent` {#org-modern-indent}
_[org-modern-indent](https://github.com/jdtsmith/org-modern-indent) extends org-modern stylistic improvements to contexts involving indentation._
```emacs-lisp
(use-package org-modern-indent
:ensure (:host github
:repo "jdtsmith/org-modern-indent")
:after org-modern
:hook org-mode-hook
:config
;; Remove the ╭│╰ bracket decoration; it renders with gaps in
;; variable-pitch buffers because the line height exceeds the glyph height.
(setq org-modern-indent-begin " "
org-modern-indent-guide " "
org-modern-indent-end " "))
```
`org-indent-pixel` {#org-indent-pixel}
_[org-indent-pixel](https://github.com/benthamite/org-indent-pixel) fixes misaligned wrapped lines in `variable-pitch-mode`._
```emacs-lisp
(use-package org-indent-pixel
:ensure (:host github :repo "benthamite/org-indent-pixel")
:after org-indent
:init
(org-indent-pixel-setup))
```
`org-tidy` {#org-tidy}
_[org-tidy](https://github.com/jxq0/org-tidy) hides org-mode property drawers._
```emacs-lisp
(use-package org-tidy
:after org
:custom
(org-tidy-properties-inline-symbol "")
(org-tidy-protect-overlay nil) ; github.com/jxq0/org-tidy/issues/11
:hook org-mode-hook)
```
`org-appear` {#org-appear}
_[org-appear](https://github.com/awth13/org-appear) toggles the visibility of hidden org mode element parts upon entering and leaving those elements._
```emacs-lisp
(use-package org-appear
:after org
:hook org-mode-hook)
```
`face-remap` {#face-remap}
_[face-remap](https://github.com/emacs-mirror/emacs/blob/master/lisp/face-remap.el) defines simple operations for face remapping._
```emacs-lisp
(use-feature face-remap
:after eww
:hook
((elfeed-show-mode-hook
telega-webpage-mode-hook
eww-mode-hook
mu4e-view-mode-hook
outline-mode-hook) . variable-pitch-mode)
:bind
(:map eww-mode-map
("+" . text-scale-increase)
("-" . text-scale-decrease)))
```
`modus-themes` {#modus-themes}
_[modus-themes](https://protesilaos.com/emacs/modus-themes) are a pair of highly accessible light and dark themes for Emacs._
```emacs-lisp
(use-package modus-themes
:ensure (:host github
:repo "protesilaos/modus-themes")
:after faces faces-extras simple-extras
:demand t
:custom
(modus-themes-mixed-fonts t)
:config
(setopt modus-themes-common-palette-overrides
`((fringe unspecified) ; hide the fringe
(bg-prose-block-delimiter bg-inactive)
(fg-prose-block-delimiter gray)
;; for the rest, use the predefined intense values
,@modus-themes-preset-overrides-intense))
:hook
(modus-themes-after-load-theme-hook . faces-extras-set-custom-face-attributes)
(modus-themes-after-load-theme-hook . frame-extras-restore-window-divider))
```
`modus-themes-extras` {#modus-themes-extras}
_[modus-themes-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/modus-themes-extras.el) collects my extensions for `modus-themes`._
```emacs-lisp
(use-personal-package modus-themes-extras
:after modus-themes
:demand t
:custom
(modus-themes-extras-light-theme 'modus-operandi-tinted)
(modus-themes-extras-dark-theme 'modus-vivendi-tinted)
:config
(init-override-code
:modus-themes-load
'((modus-themes-extras-load-theme-conditionally)))
:hook
(modus-themes-after-load-theme-hook . modus-themes-extras-highlight-parentheses)
(modus-themes-after-load-theme-hook . modus-themes-extras-set-faces)
:bind
("A-u" . modus-themes-extras-toggle))
```
`highlight-parentheses` {#highlight-parentheses}
_[highlight-parentheses](https://sr.ht/~tsdh/highlight-parentheses.el/) dynamically highlights the parentheses surrounding point based on nesting-level using configurable lists of colors, background colors, and other properties._
```emacs-lisp
(use-package highlight-parentheses
:custom
(highlight-parentheses-delay 0)
:config
(global-highlight-parentheses-mode)
:hook
(minibuffer-setup-hook . highlight-parentheses-minibuffer-setup))
```
`spacious-padding` {#spacious-padding}
_[spacious-padding](https://git.sr.ht/~protesilaos/spacious-padding) increases the spacing of frames and windows._
```emacs-lisp
(use-package spacious-padding
:ensure (:tag "0.3.0") ; using tagged version to avoid error on 2024-02-21
:custom
(spacious-padding-widths '())
:config
(spacious-padding-mode))
```
`emoji` {#emoji}
_emoji provides commands for emoji insertion._
```emacs-lisp
(use-feature emoji
:bind
("H-:" . emoji-search))
```
`color` {#color}
_color is a color manipulation library._
```emacs-lisp
(use-feature color)
```
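For a sense of what the library offers, this sketch exercises two of its built-in helpers (the color values are arbitrary examples):

```emacs-lisp
(require 'color)

;; Convert a color name to its RGB components, each in [0, 1],
;; and produce a lightened variant of a hex color.
(color-name-to-rgb "steelblue")
(color-lighten-name "#4682b4" 10) ; lighten by 10 percent
```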
`color-extras` {#color-extras}
_[color-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/color-extras.el) collects my extensions for `color`._
Note that the loading of `color` cannot be deferred, since it is required by `pulse`. Because `:after color` would therefore load this package at startup, we defer-load it explicitly.
```emacs-lisp
(use-personal-package color-extras
:after color
:defer 30)
```
`rainbow-mode` {#rainbow-mode}
_[rainbow-mode](https://elpa.gnu.org/packages/rainbow-mode.html) colorizes strings that match color names._
```emacs-lisp
(use-package rainbow-mode
:after color-extras
:custom
(rainbow-ansi-colors nil)
(rainbow-x-colors nil))
```
`ct` {#ct}
_[ct](https://github.com/neeasade/ct.el) is a color library for making changes to individual colors in various color spaces._
```emacs-lisp
(use-package ct
:after color-extras)
```
`hsluv` {#hsluv}
_[hsluv](https://github.com/hsluv/hsluv-emacs) is a HSLuv implementation for Emacs Lisp._
```emacs-lisp
(use-package hsluv
:after color-extras)
```
`image` {#image}
_[image](https://github.com/emacs-mirror/emacs/blob/master/lisp/image.el) provides image-manipulation functions._
```emacs-lisp
(use-feature image
:after image-mode
:init
;; Use imagemagick, if available.
;; djcbsoftware.nl/code/mu/mu4e/Viewing-images-inline.html
(when (fboundp 'imagemagick-register-types)
(imagemagick-register-types))
:bind
(:map image-mode-map
("+" . image-increase-size)
("-" . image-decrease-size)))
```
`image-mode` {#image-mode}
_[image-mode](https://github.com/emacs-mirror/emacs/blob/master/lisp/image-mode.el) is a major mode for viewing images._
```emacs-lisp
(use-feature image-mode
:bind
(:map image-mode-map
("c" . dired-extras-copy-image)))
```
`paren` {#paren}
_[paren](https://github.com/emacs-mirror/emacs/blob/e7260d4eb3ed1bebcaa9e2b934f162d4bb42e413/lisp/paren.el#L4) highlights matching parens._
```emacs-lisp
(use-feature paren
:custom
(show-paren-delay 0)
:config
(show-paren-mode))
```
`doom-modeline` {#doom-modeline}
_[doom-modeline](https://github.com/seagle0128/doom-modeline/) is a tidier and more aesthetically pleasing modeline._
I combine the modeline with the tab bar to display various types of information. Specifically, I use the modeline to display buffer-local information (such as the buffer major mode, line number, or word count), and the tab bar to display global information (such as the time and date, the weather, the computer’s battery status, and various notifications). This functionality is provided by a combination of the `doom-modeline` package, the `tab-bar` feature, and my corresponding extensions (`doom-modeline-extras` and `tab-bar-extras`). In short, I move to the tab bar some of the elements that would normally be displayed in the modeline by (1) _enabling_ those elements via the relevant `doom-modeline` user options, (2) _hiding_ those elements via the `doom-modeline-def-modeline` macro, and (3) _adding_ equivalent elements to the tab bar via the `tab-bar-format` user option.
Here’s a screenshot illustrating the modeline and tab bar in action (click to enlarge):
{{< figure src="/ox-hugo/screenshot-config.png" >}}
```emacs-lisp
(use-package doom-modeline
:ensure (:build (:not elpaca-check-version))
:demand t
:custom
(doom-modeline-time nil) ; we display time (and date) in the tab bar
(doom-modeline-buffer-name t)
;; we display the full path in the header line via `breadcrumb'
(doom-modeline-buffer-file-name-style 'file-name)
(doom-modeline-check-simple-format t)
(doom-modeline-total-line-number t)
(doom-modeline-position-column-line-format '(" %c %l"))
(doom-modeline-enable-word-count t)
(doom-modeline-indent-info nil)
(doom-modeline-github t)
(doom-modeline-github-interval 60)
(doom-modeline-irc nil)
:config
(dolist (cons '((display-time-mode-hook . doom-modeline-override-time)
(doom-modeline-mode-hook . doom-modeline-override-time)))
(remove-hook (car cons) (cdr cons))))
```
`doom-modeline-extras` {#doom-modeline-extras}
_[doom-modeline-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/doom-modeline-extras.el) collects my extensions for `doom-modeline`._
```emacs-lisp
(use-personal-package doom-modeline-extras
:after doom-modeline
:demand t
:config
(doom-modeline-def-modeline 'main
'(bar workspace-name parrot buffer-info modals matches follow remote-host buffer-position tlon-paragraph word-count selection-info org-roam-backlinks)
'(tlon-split compilation objed-state misc-info persp-name grip irc mu4e gnus lsp minor-modes input-method indent-info buffer-encoding major-mode process vcs check time))
(doom-modeline-def-modeline 'vcs
'(bar window-number modals matches buffer-info remote-host buffer-position parrot selection-info)
'(compilation misc-info irc mu4e gnus minor-modes buffer-encoding major-mode process time))
(doom-modeline-def-modeline 'dashboard
'(bar window-number modals buffer-default-directory-simple remote-host)
'(compilation misc-info irc mu4e gnus minor-modes input-method major-mode process time))
(doom-modeline-def-modeline 'project
'(bar window-number modals buffer-default-directory remote-host buffer-position)
'(compilation misc-info irc mu4e gnus github minor-modes input-method major-mode process time))
(doom-modeline-mode))
```
`tab-bar` {#tab-bar}
_tab-bar displays a tab bar at the top of the frame, just below the tool bar._
```emacs-lisp
(use-feature tab-bar
:after faces-extras
:custom
(auto-resize-tab-bars t)
(tab-bar-format '(tab-bar-format-global))
:config
(setf mode-line-misc-info
;; When the tab-bar is active, don't show `global-mode-string'
;; in `mode-line-misc-info', because we now show that in the
;; tab-bar using `tab-bar-format-global'.
(remove '(global-mode-string ("" global-mode-string))
mode-line-misc-info))
:hook
(init-post-init-hook
. (lambda ()
"Set and store the tab bar attributes, then activate the tab bar."
(faces-extras-set-and-store-face-attributes
'((tab-bar :background (face-background 'mode-line-active)
:box `(:line-width 6 :color ,(face-attribute 'mode-line-active :background) :style nil))))
(tab-bar-mode))))
```
`tab-bar-extras` {#tab-bar-extras}
_[tab-bar-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/tab-bar-extras.el) collects my extensions for `tab-bar`._
```emacs-lisp
(use-personal-package tab-bar-extras
:config
(setq tab-bar-extras-global-mode-string
`(,tab-bar-extras-prefix-element
,tab-bar-extras-notification-status-element
;; ,tab-bar-extras-time-element
;; ,tab-bar-extras-separator-element
,tab-bar-extras-emacs-profile-element
;; ,tab-bar-extras-separator-element
;; ,tab-bar-extras-battery-element
,tab-bar-extras-telega-element
,tab-bar-extras-github-element
,tab-bar-extras-pomodoro-element
,tab-bar-extras-debug-element
;; we add a separator at the end because `wttr' appends itself after it
,tab-bar-extras-separator-element))
:hook
(init-post-init-hook
. (lambda ()
"Reset the tab shortly after startup to show all its elements correctly."
(run-with-timer 1 nil #'tab-bar-extras-quick-reset))))
```
`breadcrumb` {#breadcrumb}
_[breadcrumb](https://github.com/joaotavora/breadcrumb/) displays project information in the header line._
```emacs-lisp
(use-package breadcrumb
:custom
(breadcrumb-project-max-length 0.5)
(breadcrumb-project-crumb-separator "/")
(breadcrumb-imenu-max-length 1.0)
(breadcrumb-imenu-crumb-separator " > ")
:config
(breadcrumb-mode))
```
`battery` {#battery}
_[battery](https://github.com/emacs-mirror/emacs/blob/master/lisp/battery.el) displays battery status information._
```emacs-lisp
(use-feature battery
:defer t
:config
(display-battery-mode))
```
`nerd-icons` {#nerd-icons}
_[nerd-icons](https://github.com/rainstormstudio/nerd-icons.el) is a library for [Nerd Font](https://github.com/ryanoasis/nerd-fonts) icons inside Emacs._
Note that the icons need to be installed via `nerd-icons-install-fonts`. If you want to install the icons with `brew` on macOS, run `brew tap homebrew/cask-fonts && brew install --cask font-symbols-only-nerd-font`.
```emacs-lisp
(use-package nerd-icons
:defer t)
```
`menu-bar` {#menu-bar}
_[menu-bar](https://github.com/emacs-mirror/emacs/blob/master/lisp/menu-bar.el) defines the menu bar._
```emacs-lisp
(use-feature menu-bar
:config
(menu-bar-mode -1))
```
`tool-bar` {#tool-bar}
_[tool-bar](https://github.com/emacs-mirror/emacs/blob/master/lisp/tool-bar.el) provides the tool bar._
```emacs-lisp
(use-feature tool-bar
:config
(tool-bar-mode -1))
```
`scroll-bar` {#scroll-bar}
_[scroll-bar](https://github.com/emacs-mirror/emacs/blob/master/lisp/scroll-bar.el) handles window scroll bars._
```emacs-lisp
(use-feature scroll-bar
:config
(scroll-bar-mode -1))
```
`pixel-scroll` {#pixel-scroll}
_[pixel-scroll](https://github.com/emacs-mirror/emacs/blob/master/lisp/pixel-scroll.el) supports smooth scrolling._
```emacs-lisp
(use-feature pixel-scroll
:config
(pixel-scroll-precision-mode))
```
`delsel` {#delsel}
_[delsel](https://github.com/emacs-mirror/emacs/blob/master/lisp/delsel.el) deletes the selection when the user starts typing._
```emacs-lisp
(use-feature delsel
:config
(delete-selection-mode))
```
`hl-line` {#hl-line}
_[hl-line](https://github.com/emacs-mirror/emacs/blob/master/lisp/hl-line.el) highlights the current line._
```emacs-lisp
(use-feature hl-line
:hook
(dired-mode-hook . hl-line-mode)
(ledger-reconcile-mode-hook . hl-line-mode))
```
`lin` {#lin}
_[lin](https://protesilaos.com/codelog/2022-09-08-lin-1-0-0/) is a stylistic enhancement for Emacs’ built-in `hl-line-mode`. It remaps the `hl-line` face (or equivalent) buffer-locally to a style optimal for major modes where line selection is the primary mode of interaction._
```emacs-lisp
(use-package lin
:custom
(lin-face 'lin-blue)
(lin-mode-hooks
'(dired-mode-hook
elfeed-search-mode-hook
git-rebase-mode-hook
grep-mode-hook
ibuffer-mode-hook
ilist-mode-hook
ledger-report-mode-hook
log-view-mode-hook
magit-log-mode-hook
mu4e-headers-mode-hook
occur-mode-hook
org-agenda-mode-hook
pdf-outline-buffer-mode-hook
proced-mode-hook
tabulated-list-mode-hook))
:hook
(init-post-init-hook . lin-global-mode))
```
`jit-lock` {#jit-lock}
_[jit-lock](https://github.com/emacs-mirror/emacs/blob/master/lisp/jit-lock.el) provides just-in-time fontification._
I have [noticed](https://emacs.stackexchange.com/questions/72417/face-properties-fail-to-apply-to-parts-of-org-mode-buffer/72439#72439) that Emacs will sometimes fail to fontify parts of a buffer. This problem is solved, in my experience, by increasing the value of the user option `jit-lock-chunk-size`. Its docstring says that “The optimum value is a little over the typical number of buffer characters which fit in a typical window”, so we set its value dynamically by multiplying the number of lines per window by the number of characters per line, doubling that for safety.
```emacs-lisp
(use-feature jit-lock
:custom
(jit-lock-chunk-size
(* (window-max-chars-per-line) (window-body-height) 2)))
```
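For concreteness, on a hypothetical 100-character-wide, 50-line window the expression above evaluates as follows (illustrative numbers only; the actual values depend on the startup window):

```emacs-lisp
;; Hypothetical window: 100 characters per line, 50 lines.
(* 100 50 2) ; ⇒ 10000, about double the ~5000 characters visible at once
```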
text movement {#text-movement}
words {#words}
_Keybindings for word-level movement._
```emacs-lisp
(use-feature simple
:bind
(("A-C-s-p" . forward-word)
("A-C-s-u" . backward-word)))
```
lines {#lines}
_Keybindings for line-level movement._
```emacs-lisp
(use-feature simple
:commands next-line previous-line
:init
(with-eval-after-load 'em-hist
(bind-keys :map eshell-hist-mode-map
("<up>" . previous-line)
("<down>" . next-line)))
(with-eval-after-load 'cus-edit
(bind-keys :map custom-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'ebib
(bind-keys :map ebib-entry-mode-map
("k" . previous-line)
("l" . next-line))
(bind-keys :map ebib-index-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'elfeed
(bind-keys :map elfeed-show-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'elisp-refs
(bind-keys :map elisp-refs-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'eww
(bind-keys :map eww-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'forge-notify
(bind-keys :map forge-notifications-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'help
(bind-keys :map help-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'helpful
(bind-keys :map helpful-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'info
(bind-keys :map Info-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'johnson
(bind-keys :map johnson-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'ledger-reconcile
(bind-keys :map ledger-reconcile-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'man
(bind-keys :map Man-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'mu4e
(bind-keys :map mu4e-view-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'org-lint
(bind-keys :map org-lint--report-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'osa-chrome
(bind-keys :map osa-chrome-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'pass
(bind-keys :map pass-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'simple
(bind-keys :map special-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'slack
(bind-keys :map slack-message-buffer-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'slack
(bind-keys :map slack-activity-feed-buffer-mode-map
("k" . previous-line)
("l" . next-line)))
(with-eval-after-load 'wasabi
(bind-keys :map wasabi-mode-map
("k" . previous-line)
("l" . next-line)))
:bind
(("A-C-s-m" . move-beginning-of-line)
("A-C-s-/" . move-end-of-line)))
```
sentences {#sentences}
_Keybindings for sentence-level movement._
```emacs-lisp
(use-feature emacs
:bind
(("A-C-s-i" . backward-sentence)
("A-C-s-o" . forward-sentence)))
```
paragraphs {#paragraphs}
_Keybindings for paragraph-level movement._
```emacs-lisp
(use-feature emacs
:bind
(("A-C-s-," . backward-paragraph)
("A-C-s-." . forward-paragraph)))
```
sexps {#sexps}
_Keybindings for sexp-level movement._
```emacs-lisp
(use-feature emacs
:bind
(("A-C-s-e" . backward-sexp)
("A-H-M-s-d" . forward-sexp) ; nonstandard binding because otherwise intercepted by OSX
))
```
defuns {#defuns}
_Keybindings for defun-level movement._
```emacs-lisp
(use-feature emacs
:bind
(("A-C-s-w" . beginning-of-defun)
("A-C-s-s" . end-of-defun)))
```
buffers {#buffers}
_Keybindings for buffer-level movement._
```emacs-lisp
(use-feature simple
:bind
(("A-C-s-SPC" . beginning-of-buffer)
("A-C-s-<tab>" . end-of-buffer)))
```
text manipulation {#text-manipulation}
`simple` {#simple}
_[simple](https://github.com/emacs-mirror/emacs/blob/master/lisp/simple.el) configures the kill ring, transposing, and text manipulation commands._
```emacs-lisp
(use-feature simple
:custom
(kill-ring-max 99999)
(save-interprogram-paste-before-kill t) ; add system clipboard to kill ring
(auto-save-interval 5)
:bind
(("A-H-M-d" . transpose-chars)
("A-H-M-e" . transpose-sentences)
("A-H-M-f" . transpose-sexps)
("A-H-M-r" . transpose-words)
("A-H-M-v" . transpose-lines)
("C-k" . nil)
("C-<delete>" . nil)
("C-H-M-=" . overwrite-mode)
("C-H-M-a" . backward-kill-sexp)
("C-H-M-d" . delete-forward-char)
("C-H-M-e" . kill-sentence)
("C-H-M-f" . kill-sexp)
("C-H-M-f" . zap-to-char)
("C-H-M-g" . append-next-kill)
("C-H-M-q" . backward-kill-word)
("C-H-M-r" . kill-word)
("C-H-M-s" . delete-backward-char)
("M-SPC" . cycle-spacing)
("C-H-M-v" . kill-line)
("C-H-M-w" . backward-kill-sentence)
("C-H-M-z" . crux-kill-line-backwards)
("C-M-<backspace>" . nil)
("C-M-k" . nil)
("H-v" . yank)
("M-DEL" . nil)
("s-C" . nil)))
```
`simple-extras` {#simple-extras}
_[simple-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/simple-extras.el) collects my extensions for `simple`._
```emacs-lisp
(use-personal-package simple-extras
:demand t
:bind
(("A-C-H-a" . simple-extras-copy-whole-sexp)
("A-C-H-f" . simple-extras-delete-whole-sexp)
("A-C-H-M-S-s-a" . simple-extras-backward-delete-sexp)
("A-C-H-M-S-s-a" . simple-extras-backward-zap-delete-to-char)
("A-C-H-M-S-s-e" . simple-extras-delete-sentence)
("A-C-H-M-S-s-f" . simple-extras-delete-sexp)
("A-C-H-M-S-s-f" . simple-extras-zap-delete-to-char)
("A-C-H-M-S-s-q" . simple-extras-backward-delete-word)
("A-C-H-M-S-s-r" . simple-extras-delete-word)
("A-C-H-M-S-s-v" . simple-extras-delete-line)
("A-C-H-M-S-s-w" . simple-extras-backward-delete-sentence)
("A-C-H-M-S-s-z" . simple-extras-backward-delete-line)
("A-H-c" . simple-extras-count-words-dwim)
("A-C-H-e" . simple-extras-delete-whole-sentence)
("A-C-H-i" . simple-extras-kill-whole-sentence)
("A-C-H-m" . simple-extras-kill-whole-line)
("A-C-H-r" . simple-extras-delete-whole-word)
("A-C-H-u" . simple-extras-kill-whole-word)
("A-C-H-v" . simple-extras-delete-whole-line)
("A-C-H-w" . simple-extras-copy-whole-sentence)
("A-C-H-z" . simple-extras-copy-whole-line)
("A-H-M-a" . simple-extras-transpose-sexps-backward)
("A-H-M-q" . simple-extras-transpose-words-backward)
("A-H-M-s" . simple-extras-transpose-chars-backward)
("A-H-M-s-9" . simple-extras-copy-whole-word) ; `.-q'
("A-H-M-w" . simple-extras-transpose-sentences-backward)
("A-H-M-z" . simple-extras-transpose-lines-backward)
("A-M-f" . simple-extras-fill-or-unfill-paragraph)
("C-g" . simple-extras-keyboard-quit-dwim)
("C-H-M-a" . simple-extras-backward-zap-to-char)
("C-H-M-b" . simple-extras-strip-thing-at-point)
("C-H-M-s-A-a" . simple-extras-backward-copy-sexp)
("C-H-M-s-A-a" . simple-extras-backward-zap-copy-to-char)
("C-H-M-s-A-e" . simple-extras-copy-sentence)
("C-H-M-s-A-f" . simple-extras-copy-sexp)
("C-H-M-s-A-f" . simple-extras-zap-copy-to-char)
("C-H-M-s-A-q" . simple-extras-backward-copy-word)
("C-H-M-s-A-r" . simple-extras-copy-word)
("C-H-M-s-A-v" . simple-extras-copy-line)
("C-H-M-s-A-w" . simple-extras-backward-copy-sentence)
("C-H-M-s-A-z" . simple-extras-backward-copy-line)
("C-v" . simple-extras-paste-no-properties)
("C-w" . simple-extras-narrow-or-widen-dwim)
("H-A-v" . simple-extras-yank-and-pop)
("H-c" . simple-extras-smart-copy-region)
("H-M" . simple-extras-exchange-point-and-mark)
("H-X" . simple-extras-smart-delete-region)
("H-x" . simple-extras-smart-kill-region)
("M-A-i" . simple-extras-visual-line-mode-enhanced)
("M-i" . simple-extras-indent-dwim)
("M-q" . simple-extras-keyboard-quit-dwim)
("M-v" . simple-extras-visible-mode-enhanced)
:map isearch-mode-map
("C-w" . simple-extras-narrow-or-widen-dwim)))
```
`paragraphs` {#paragraphs}
_[paragraphs](https://github.com/emacs-mirror/emacs/blob/master/lisp/textmodes/paragraphs.el) configures paragraph manipulation, including sentence-end and paragraph keybindings._
```emacs-lisp
(use-feature emacs
:demand t
:commands kill-paragraph transpose-paragraphs backward-kill-paragraph
:custom
(sentence-end-double-space nil)
:init
(with-eval-after-load 'text-mode
(bind-keys :map text-mode-map
("C-H-M-c" . kill-paragraph)
("C-H-M-x" . backward-kill-paragraph)
("A-C-H-M-S-s-c" . simple-extras-delete-paragraph)
("A-C-H-M-S-s-x" . simple-extras-backward-delete-paragraph)
("C-H-M-s-A-c" . simple-extras-copy-paragraph)
("C-H-M-s-A-x" . simple-extras-backward-copy-paragraph)
("A-C-H-c" . simple-extras-delete-whole-paragraph)
("A-C-H-x" . simple-extras-copy-whole-paragraph)
("A-C-H-," . simple-extras-kill-whole-paragraph)
("A-H-M-c" . transpose-paragraphs)
("A-H-M-x" . simple-extras-transpose-paragraphs-backward)))
(with-eval-after-load 'org
(bind-keys :map org-mode-map
("C-H-M-c" . kill-paragraph)
("C-H-M-x" . backward-kill-paragraph)
("A-C-H-M-S-s-c" . simple-extras-delete-paragraph)
("A-C-H-M-S-s-x" . simple-extras-backward-delete-paragraph)
("C-H-M-s-A-c" . simple-extras-copy-paragraph)
("C-H-M-s-A-x" . simple-extras-backward-copy-paragraph)
("A-C-H-c" . simple-extras-delete-whole-paragraph)
("A-C-H-x" . simple-extras-copy-whole-paragraph)
("A-C-H-," . simple-extras-kill-whole-paragraph)
("A-H-M-c" . transpose-paragraphs)
("A-H-M-x" . simple-extras-transpose-paragraphs-backward)))
(with-eval-after-load 'outline
(bind-keys :map outline-mode-map
("C-H-M-c" . kill-paragraph)
("C-H-M-x" . backward-kill-paragraph)
("A-C-H-M-S-s-c" . simple-extras-delete-paragraph)
("A-C-H-M-S-s-x" . simple-extras-backward-delete-paragraph)
("C-H-M-s-A-c" . simple-extras-copy-paragraph)
("C-H-M-s-A-x" . simple-extras-backward-copy-paragraph)
("A-C-H-c" . simple-extras-delete-whole-paragraph)
("A-C-H-x" . simple-extras-copy-whole-paragraph)
("A-C-H-," . simple-extras-kill-whole-paragraph)
("A-H-M-c" . transpose-paragraphs)
("A-H-M-x" . simple-extras-transpose-paragraphs-backward)))
(with-eval-after-load 'telega
(bind-keys :map telega-chat-mode-map
("C-H-M-c" . kill-paragraph)
("C-H-M-x" . backward-kill-paragraph)
("A-C-H-M-S-s-c" . simple-extras-delete-paragraph)
("A-C-H-M-S-s-x" . simple-extras-backward-delete-paragraph)
("C-H-M-s-A-c" . simple-extras-copy-paragraph)
("C-H-M-s-A-x" . simple-extras-backward-copy-paragraph)
("A-C-H-c" . simple-extras-delete-whole-paragraph)
("A-C-H-x" . simple-extras-copy-whole-paragraph)
("A-C-H-," . simple-extras-kill-whole-paragraph)
("A-H-M-c" . transpose-paragraphs)
("A-H-M-x" . simple-extras-transpose-paragraphs-backward))))
```
editing {#editing}
`simple` {#simple}
_[simple](https://github.com/emacs-mirror/emacs/blob/master/lisp/simple.el) configures general editing behavior, including selection, indentation, and case conversion._
```emacs-lisp
(use-feature simple
:custom
(shift-select-mode nil) ; Shift keys do not activate the mark momentarily.
;; hide commands in M-x which do not apply to the current mode.
(read-extended-command-predicate #'command-completion-default-include-p)
(eval-expression-print-level nil)
(eval-expression-print-length nil)
(print-level nil)
(print-length nil)
(truncate-partial-width-windows nil)
(tab-always-indent 'complete)
:config
(setq-default fill-column 80)
:init
(column-number-mode)
:bind
(("C-e" . eval-last-sexp)
("H-C" . kill-ring-save)
("H-m" . set-mark-command)
("H-Z" . undo-redo)
("M-o" . downcase-dwim)
("M-u" . capitalize-dwim)
("A-M-u" . upcase-dwim)
("M-w" . count-words-region)
("H-z" . undo-only)))
```
`rect` {#rect}
_[rect](https://github.com/emacs-mirror/emacs/blob/master/lisp/rect.el) provides commands for operating on rectangular regions of text._
```emacs-lisp
(use-feature rect
:bind ("C-x r w" . copy-rectangle-as-kill))
```
`repeat` {#repeat}
_[repeat](https://github.com/emacs-mirror/emacs/blob/master/lisp/repeat.el) provides commands for repeating the previous command._
```emacs-lisp
(use-feature repeat
:bind
(("M-r" . repeat)
("A-M-r" . repeat-complex-command)))
```
`view` {#view}
_[view](https://github.com/emacs-mirror/emacs/blob/master/lisp/view.el) provides a minor mode for viewing files without editing them._
```emacs-lisp
(use-feature view
:bind
("M-A-v" . view-mode))
```
`sort` {#sort}
_[sort](https://github.com/emacs-mirror/emacs/blob/master/lisp/sort.el) provides commands for sorting text in a buffer._
```emacs-lisp
(use-feature sort
:custom
(sort-fold-case t)
:bind
("C-t" . sort-lines))
```
`vundo` {#vundo}
_[vundo](https://github.com/casouri/vundo) displays the undo history as a tree._
```emacs-lisp
(use-package vundo
:custom
(undo-limit (* 100 1000 1000))
(undo-strong-limit undo-limit)
(undo-outer-limit undo-limit)
:bind
(("A-z" . vundo)
:map vundo-mode-map
("j" . vundo-backward)
(";" . vundo-forward)
("k" . vundo-previous)
("l" . vundo-next)))
```
`outline` {#outline}
_[outline](https://github.com/emacs-mirror/emacs/blob/master/lisp/outline.el) provides selective display of portions of a buffer._
```emacs-lisp
(use-feature outline
:hook
(prog-mode-hook . outline-minor-mode)
:bind
(:map outline-mode-map
("TAB" . outline-cycle)
("<backtab>" . outline-cycle-buffer)
("A-C-s-r" . outline-previous-heading)
("A-C-s-f" . outline-next-heading)
("C-H-M-s-a" . outline-extras-promote-heading)
("C-H-M-s-s" . outline-move-subtree-up)
("C-H-M-s-d" . outline-move-subtree-down)
("C-H-M-s-f" . outline-extras-demote-heading)
("C-H-M-s-q" . outline-promote)
("C-H-M-s-r" . outline-demote)
:map outline-minor-mode-map
("TAB" . outline-cycle)
("<backtab>" . outline-cycle-buffer)
("A-C-s-r" . outline-previous-heading)
("A-C-s-f" . outline-next-heading)
("C-H-M-s-a" . outline-extras-promote-heading)
("C-H-M-s-s" . outline-move-subtree-up)
("C-H-M-s-d" . outline-move-subtree-down)
("C-H-M-s-f" . outline-extras-demote-heading)
("C-H-M-s-q" . outline-promote)
("C-H-M-s-r" . outline-demote)))
```
`outline-extras` {#outline-extras}
_[outline-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/outline-extras.el) collects my extensions for `outline`._
```emacs-lisp
(use-personal-package outline-extras
:after outline)
```
`outli` {#outli}
_[outli](https://github.com/jdtsmith/outli) is a simple comment-based outliner for Emacs._
```emacs-lisp
(use-package outli
:ensure (:host github
:repo "jdtsmith/outli")
:after outline
:custom
(outli-speed-commands
'(("Outline navigation")
("k" . outline-previous-visible-heading)
("." . outline-forward-same-level)
("," . outline-backward-same-level)
("l" . outline-next-visible-heading)
("m" . outline-up-heading)
("j" . consult-imenu)
("Outline structure editing")
("q" . outline-promote)
("a" . outline-extras-promote-heading)
("d" . outline-move-subtree-down)
("s" . outline-move-subtree-up)
("f" . outline-extras-demote-heading)
("r" . outline-demote)
("Outline visibility")
("<tab>" . outline-cycle)
("C" . outline-cycle-buffer)
("w" . outli-toggle-narrow-to-subtree)
("Regular editing")
("z" . undo-only)
("v" . yank)
("Other")
("?" . outli-speed-command-help)))
:hook emacs-lisp-mode-hook)
```
`thingatpt` {#thingatpt}
_thingatpt gets the “thing” at point._
```emacs-lisp
(use-feature thingatpt
:init
;; we redefine `thing-at-point-url-path-regexp' to support Japanese URLs
;; *after* `goto-addr' is loaded, so that `goto-address-url-regexp', which is
;; defined in reference to that user option, inherits the redefinition
(with-eval-after-load 'goto-addr
(setq thing-at-point-url-path-regexp "[^]\t\n \"'<>[^`{}、。！？]*[^]\t\n \"'<>[^`{}.,;、。！？]+")))
```
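As a quick, hypothetical sanity check of the redefinition: the full-width punctuation added to the negated character classes makes matching stop before a Japanese full stop, so a URL followed by `。` no longer swallows the rest of the sentence.

```emacs-lisp
;; Hypothetical check, assuming the redefined regexp above is in effect:
(string-match thing-at-point-url-path-regexp "example.com/テスト。続き")
(match-string 0 "example.com/テスト。続き") ; ⇒ "example.com/テスト"
```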
`abbrev` {#abbrev}
_[abbrev](https://github.com/emacs-mirror/emacs/blob/master/lisp/abbrev.el) provides automatic expansion of abbreviations as you type._
```emacs-lisp
(use-feature abbrev
:custom
(save-abbrevs 'silently)
(abbrev-file-name (file-name-concat paths-dir-abbrev "abbrev_defs"))
:config
(setq-default abbrev-mode t)
;; do not look up abbrevs with case folding; e.g. `EA' will not expand an `ea' abbrev
(abbrev-table-put global-abbrev-table :case-fixed t)
(abbrev-table-put text-mode-abbrev-table :case-fixed t))
```
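To illustrate the `:case-fixed` setting above with a hypothetical abbrev: once a table is case-fixed, only the exact-case form expands.

```emacs-lisp
;; Hypothetical abbrev, for illustration only:
(define-abbrev global-abbrev-table "ea" "effective altruism")
;; With :case-fixed set on the table:
;;   typing "ea " expands to "effective altruism "
;;   typing "EA " is left alone, since case folding is disabled
```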
`abbrev-extras` {#abbrev-extras}
_[abbrev-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/abbrev-extras.el) collects my extensions for `abbrev`._
```emacs-lisp
(use-personal-package abbrev-extras
:after abbrev)
```
`yasnippet` {#yasnippet}
_[yasnippet](https://github.com/joaotavora/yasnippet) is a template system for Emacs._
```emacs-lisp
(use-package yasnippet
:custom
(yas-snippet-dirs (list paths-dir-yasnippets
paths-dir-yasnippets-private
(file-name-concat elpaca-builds-directory "yasnippet-snippets/snippets/")))
(yas-triggers-in-field t) ; allow stacked expansions
(yas-new-snippet-default
(format "# -*- mode: snippet -*-\n# name: $1\n# key: $2\n# contributor: %s\n# --\n$0" user-full-name))
:config
;; Dropbox's file provider can leave snippet files as online-only
;; placeholders, causing `file-error' "Operation canceled" when
;; yasnippet tries to read them. Catch the error so that loading
;; one unavailable file does not prevent the mode from activating.
(advice-add 'yas--load-directory-2 :around
(lambda (fn &rest args)
(condition-case err
(apply fn args)
(file-error
(message "yasnippet: skipping unreadable directory %s: %s"
(car args) (error-message-string err))))))
(yas-global-mode)
:hook
(minibuffer-setup-hook . yas-minor-mode)
:bind
("C-y" . yas-new-snippet))
```
`yasnippet-snippets` {#yasnippet-snippets}
_[yasnippet-snippets](https://github.com/AndreaCrotti/yasnippet-snippets) is a public repository of yasnippet snippets._
```emacs-lisp
(use-package yasnippet-snippets
:after yasnippet)
```
`expand-region` {#expand-region}
_[expand-region](https://github.com/magnars/expand-region.el) incrementally selects regions by semantic units._
```emacs-lisp
(use-package expand-region
:bind
(("C-H-s-n" . er/expand-region)
("C-H-s-h" . er/contract-region)))
```
`newcomment` {#newcomment}
_[newcomment](https://github.com/emacs-mirror/emacs/blob/master/lisp/newcomment.el) provides commands for commenting and uncommenting code._
```emacs-lisp
(use-feature newcomment
:bind
("M-/" . comment-line))
```
`skeleton` {#skeleton}
_skeleton provides a concise language for writing statement-skeleton insertion commands for programming modes._
The code block below specifies how certain characters should be paired either globally or in specific modes.
```emacs-lisp
(use-feature skeleton
:init
(setq skeleton-pair t)
(with-eval-after-load 'telega
(bind-keys :map telega-chat-mode-map
("~" . skeleton-pair-insert-maybe)
("=" . skeleton-pair-insert-maybe)))
(with-eval-after-load 'markdown-mode
(bind-keys :map markdown-mode-map
("*" . skeleton-pair-insert-maybe)
("`" . skeleton-pair-insert-maybe)))
(with-eval-after-load 'forge
(bind-keys :map forge-post-mode-map
("*" . skeleton-pair-insert-maybe)
("`" . skeleton-pair-insert-maybe)))
:hook
((markdown-mode-hook forge-post-mode-hook) .
(lambda ()
"Use two backticks, rather than ` and ', in selected modes."
(setq-local skeleton-pair-alist '((?` _ ?`)))))
:bind
(("[" . skeleton-pair-insert-maybe)
("{" . skeleton-pair-insert-maybe)
("(" . skeleton-pair-insert-maybe)
("\"" . skeleton-pair-insert-maybe)
("«" . skeleton-pair-insert-maybe)
:map org-mode-map
("~" . skeleton-pair-insert-maybe)
("=" . skeleton-pair-insert-maybe)
:map emacs-lisp-mode-map
("`" . skeleton-pair-insert-maybe)
:map lisp-interaction-mode-map
("`" . skeleton-pair-insert-maybe)))
```
`crux` {#crux}
_[crux](https://github.com/bbatsov/crux) is a “collection of ridiculously useful extensions”._
```emacs-lisp
(use-package crux
:init
(defun crux-smart-open-line-before ()
"Insert an empty line before the current line."
(interactive)
(crux-smart-open-line t))
:bind
(("M-l" . crux-smart-open-line)
("M-A-l" . crux-smart-open-line-before)
("A-H-l" . crux-duplicate-current-line-or-region)))
```
`button` {#button}
_[button](https://github.com/emacs-mirror/emacs/blob/master/lisp/button.el) defines functions for inserting and manipulating clickable buttons in Emacs buffers._
```emacs-lisp
(use-feature button
:after telega
:bind
(("A-C-M-s-j" . backward-button)
("A-C-M-s-;" . forward-button)
:map telega-chat-mode-map
("M-RET" . push-button)))
```
`back-button` {#back-button}
_[back-button](https://github.com/rolandwalker/back-button) supports navigating the mark ring forward and backward._
```emacs-lisp
(use-package back-button
:config
(back-button-mode)
:bind
(("H-," . back-button-local-backward)
("H-." . back-button-local-forward)
("H-<" . back-button-global-backward)
("H->" . back-button-global-forward)))
```
`goto-last-change` {#goto-last-change}
_[goto-last-change](https://github.com/camdez/goto-last-change.el) moves point through buffer-undo-list positions._
```emacs-lisp
(use-package goto-last-change
:bind
("C-z" . goto-last-change))
```
`goto-addr` {#goto-addr}
_[goto-addr](https://github.com/emacs-mirror/emacs/blob/master/lisp/net/goto-addr.el) activates URLs and e-mail addresses in buffers._
```emacs-lisp
(use-feature goto-addr
:config
(global-goto-address-mode))
```
registers & bookmarks {#registers-and-bookmarks}
`register` {#register}
_[register](https://github.com/emacs-mirror/emacs/blob/master/lisp/register.el) saves text, rectangles, positions, and other things for later use._
```emacs-lisp
(use-feature register
:after savehist
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'register-alist)))
```
`register-extras` {#register-extras}
_[register-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/register-extras.el) collects my extensions for `register`._
```emacs-lisp
(use-personal-package register-extras
:bind
("C-r" . register-extras-dispatch))
```
`bookmark` {#bookmark}
_[bookmark](https://github.com/emacs-mirror/emacs/blob/master/lisp/bookmark.el) provides a system for recording and jumping to named positions in files._
```emacs-lisp
(use-feature bookmark
:defer t
:custom
(bookmark-default-file paths-file-bookmarks) ; Set location of bookmarks file
(bookmark-save-flag 1)) ; Save bookmarks after each entry
```
files & buffers {#files-and-buffers}
`files` {#files}
_[files](https://github.com/emacs-mirror/emacs/blob/master/lisp/files.el) provides core commands for visiting, saving, and managing files._
```emacs-lisp
(use-feature files
:after savehist
:custom
(confirm-kill-processes nil) ; do not prompt to kill running processes when quitting Emacs
(delete-by-moving-to-trash t)
(trash-directory (expand-file-name (file-name-concat "~" ".Trash"))) ; fallback for `move-file-to-trash'
(remote-file-name-inhibit-delete-by-moving-to-trash t)
(remote-file-name-inhibit-auto-save t)
(find-file-visit-truename t) ; visit files under their truenames, resolving symlinks
(create-lockfiles nil) ; lockfiles are indexed by `org-roam', which causes problems with `org-agenda'
(large-file-warning-threshold (* 200 1000 1000))
(enable-local-variables :all)
(insert-directory-program "/opt/homebrew/bin/gls") ; use coreutils to avoid 'listing directory failed' error
(auto-save-no-message t)
(delete-old-versions t)
(make-backup-files nil)
(version-control 'never)
(auto-save-visited-interval 1)
(require-final-newline t)
(revert-without-query '(".*"))
:config
(setq kill-buffer-query-functions nil) ; not a customizable variable
;; we enable `auto-save-visited-mode' globally...
(auto-save-visited-mode)
;; ...but then we disable it for all buffers
(setq-default auto-save-visited-mode nil)
;; so that we can then re-enable it for specific buffers via a file-local variable:
;; # Local Variables:
;; # eval: (setq-local auto-save-visited-mode t)
;; # End:
(advice-add 'recover-session
:after (lambda ()
"Disable `dired-hide-details-mode' to show dates in `recover-session'."
(dired-hide-details-mode -1)))
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'file-name-history))
(add-to-list 'auto-mode-alist '("\\.mdx\\'" . markdown-mode))
(add-to-list 'safe-local-eval-forms
'(add-hook 'after-save-hook
(lambda ()
(require 'ox-texinfo)
(let ((inhibit-message t))
(org-texinfo-export-to-texinfo)))
nil t))
:bind
(("M--" . not-modified)
("H-a" . mark-whole-buffer)
("H-s" . save-buffer)
("C-b" . clone-indirect-buffer-other-window)
("H-C-g" . abort-recursive-edit)
("H-C-S-g" . top-level)
("H-C-A-g" . keyboard-escape-quit))) ; ESC ESC ESC
```
`files-extras` {#files-extras}
_[files-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/files-extras.el) collects my extensions for `files`._
```emacs-lisp
(use-personal-package files-extras
:bind
(("M-;" . files-extras-copy-current-path)
("H-f" . files-extras-dispatch)
("M-b" . files-extras-save-and-revert-buffer)
("M-e" . files-extras-eval-region-or-buffer)
("H-q" . files-extras-kill-this-buffer)
("A-H-M-s-q" . files-extras-kill-this-buffer-switch-to-other-window)
("A-H-q" . files-extras-kill-other-buffer)
("H-n" . files-extras-new-empty-buffer)
("A-H-n" . files-extras-new-buffer-in-current-mode)
("H-S" . files-extras-save-all-buffers)
("A-H-M-s-SPC" . files-extras-switch-to-alternate-buffer)
("A-H-v" . files-extras-internet-archive-dwim))
:init
(with-eval-after-load 'info
(bind-keys :map Info-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'apropos
(bind-keys :map apropos-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'calendar
(bind-keys :map calendar-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'simple ; `completion-list-mode-map' is defined in simple.el
(bind-keys :map completion-list-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'dired
(bind-keys :map dired-mode-map
("q" . files-extras-kill-this-buffer)
("s-o" . files-extras-ocr-pdf)))
(with-eval-after-load 'ebib
(bind-keys :map ebib-entry-mode-map
("v" . files-extras-internet-archive-dwim)
("q" . files-extras-bury-buffer-switch-to-other-window)
:map ebib-index-mode-map
("q" . files-extras-bury-buffer-switch-to-other-window)))
(with-eval-after-load 'elfeed
(bind-keys :map elfeed-show-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'finder
(bind-keys :map finder-mode-map
("q" . files-extras-kill-this-buffer)))
;; We typically enter these modes to lookup some information and
;; then return to the previous buffer, so we set `q' to switch to
;; the other window, and reserve `Q' for the normal behavior
(with-eval-after-load 'help
(bind-keys :map help-mode-map
("Q" . files-extras-kill-this-buffer)
("q" . files-extras-kill-this-buffer-switch-to-other-window)))
(with-eval-after-load 'helpful
(bind-keys :map helpful-mode-map
("Q" . files-extras-kill-this-buffer)
("q" . files-extras-kill-this-buffer-switch-to-other-window)))
(with-eval-after-load 'ledger-reconcile
(bind-keys :map ledger-reconcile-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'markdown-mode
(bind-keys :map markdown-mode-map
("s-g" . files-extras-grammarly-open-in-external-editor)
:map gfm-mode-map
("s-g" . files-extras-grammarly-open-in-external-editor)))
(with-eval-after-load 'mu4e
(bind-keys :map mu4e-headers-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'pass
(bind-keys :map pass-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'pdf-view
(bind-keys :map pdf-view-mode-map
("s-o" . files-extras-ocr-pdf)))
(with-eval-after-load 'simple
(bind-keys :map messages-buffer-mode-map
("q" . files-extras-bury-buffer-switch-to-other-window)))
(with-eval-after-load 'slack
(bind-keys :map slack-activity-feed-buffer-mode-map
("q" . files-extras-kill-this-buffer))
(bind-keys :map slack-message-buffer-mode-map
("q" . files-extras-kill-this-buffer))
(bind-keys :map slack-thread-message-buffer-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'simple ; `special-mode-map' is defined in simple.el
(bind-keys :map special-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'telega
(bind-keys :map telega-root-mode-map
("q" . files-extras-bury-buffer-switch-to-other-window)))
(with-eval-after-load 'tetris
(bind-keys :map tetris-mode-map
("q" . files-extras-kill-this-buffer)))
(with-eval-after-load 'view
(bind-keys :map view-mode-map
("q" . files-extras-kill-this-buffer))))
```
`locate` {#locate}
_[locate](https://github.com/emacs-mirror/emacs/blob/master/lisp/locate.el) provides an interface for finding files using a locate database._
```emacs-lisp
(use-feature locate
:after consult
:custom
(locate-command "mdfind")) ; use the OSX Spotlight backend
```
`autorevert` {#autorevert}
_autorevert automatically reverts buffers when their files change on disk._
```emacs-lisp
(use-feature autorevert
:custom
(auto-revert-use-notify nil) ; reddit.com/r/emacs/comments/mq2znn/comment/gugo0n4/
(auto-revert-verbose nil)
:hook
(find-file-hook . global-auto-revert-mode))
```
`dired` {#dired}
_dired is the Emacs directory editor._
```emacs-lisp
(use-feature dired
:init
(with-eval-after-load 'pdf-annot
(bind-keys
:map pdf-annot-minor-mode-map
("x" . dired-jump)))
:custom
(dired-listing-switches "-AGFhlv --group-directories-first --time-style=long-iso")
(dired-auto-revert-buffer t)
(dired-recursive-copies 'always)
(dired-recursive-deletes 'always)
(dired-no-confirm t) ; never ask for confirmation
(dired-dwim-target t) ; if Dired buffer in other window, use that buffer's current directory as target
(dired-vc-rename-file t)
(dired-do-revert-buffer t)
(dired-create-destination-dirs 'ask)
(dired-guess-shell-alist-user '(("" "open")))
:config
(setq dired-deletion-confirmer (lambda (x) t)) ; not a customizable variable
(put 'dired-find-alternate-file 'disabled nil) ; do not disable dired-find-alternate-file!
:hook
(dired-mode-hook . dired-hide-details-mode)
(dired-mode-hook . (lambda () (visual-line-mode -1)))
:bind
(:map dired-mode-map
("<tab>" . dired-extras-subtree-toggle)
(";" . dired-do-rename)
("." . dired-find-alternate-file)
("C" . dired-do-copy)
("s-s" . dired-isearch-filenames)
("J" . dired-jump-other-window)
("e" . browse-url-extras-of-dired-file-externally)
("f" . avy-extras-dired-find-file)
("k" . dired-previous-line)
("l" . dired-next-line)
("r" . dired-toggle-read-only)
("H-z" . dired-undo)
("A-C-s-r" . dired-prev-dirline)
("A-C-s-f" . dired-next-dirline)
("A-C-s-," . dired-prev-marked-file)
("A-C-s-." . dired-next-marked-file)
("C-o" . nil)))
```
`dired-x` {#dired-x}
_[dired-x](https://github.com/emacs-mirror/emacs/blob/master/lisp/dired-x.el) provides extra Dired functionality._
```emacs-lisp
(use-feature dired-x
:custom
(dired-omit-verbose nil) ; shut up
(dired-omit-size-limit nil) ; always omit, regardless of directory size
:config
(setopt dired-omit-files
(concat dired-omit-files "\\|^.localized$\\|^\\.DS_Store$\\|^\\.pdf-view-restore\\|^Icon\\\015"))
:hook
(dired-mode-hook . dired-omit-mode))
```
`dired-extras` {#dired-extras}
_[dired-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/dired-extras.el) collects my extensions for `dired`._
```emacs-lisp
(use-personal-package dired-extras
:hook
(dired-mode-hook . (lambda () (require 'dired-extras)))
:bind
(("H-d" . dired-extras-dispatch)
:map dired-mode-map
("," . dired-extras-up-directory-reuse)
("-" . dired-extras-hide-details-mode-enhanced)
("H-." . dired-extras-dotfiles-toggle)
("c" . dired-extras-copy-filename-as-kill-absolute)
("w" . dired-extras-copy-filename-as-kill-dwim)
("W" . dired-extras-copy-filename-as-kill-sans-extension)
("z" . dired-extras-mark-screenshots)
("s" . dired-extras-sort-toggle-dwim)
("s-d" . dired-extras-do-delete-fast)
("s-r" . dired-extras-copy-to-remote-docs-directory)))
```
`dired-aux` {#dired-aux}
_[dired-aux](https://github.com/emacs-mirror/emacs/blob/master/lisp/dired-aux.el) provides auxiliary Dired functions for file operations like compression and diffing._
```emacs-lisp
(use-feature dired-aux
:after dired-x
:config
;; with `unar' installed, `Z' uncompresses `rar' files
(push '("\\.rar\\'" "" "unar") dired-compress-file-suffixes))
```
`dired-git-info` {#dired-git-info}
_[dired-git-info](https://github.com/clemera/dired-git-info) displays information about Git files in Dired._
```emacs-lisp
(use-package dired-git-info
:after dired
:custom
(dgi-commit-message-format "%s %cr %an")
:bind
(:map dired-mode-map
("b" . dired-git-info-mode)))
```
`dired-du` {#dired-du}
_[dired-du](https://github.com/calancha/dired-du) displays the recursive size of directories in Dired._
```emacs-lisp
(use-package dired-du
:after dired
:custom
(dired-du-size-format 'comma)
:bind
(:map dired-mode-map
("'" . dired-du-mode)))
```
`image-dired` {#image-dired}
_[image-dired](https://github.com/emacs-mirror/emacs/blob/master/lisp/image-dired.el) provides image viewing and thumbnail management in Dired._
```emacs-lisp
(use-feature image-dired
:after dired
:custom
(image-dired-main-image-directory (expand-file-name "~/Pictures/"))
(image-dired-external-viewer "open")
:bind
(:map image-dired-thumbnail-mode-map
("c" . dired-extras-image-copy-filename-as-kill-absolute)
("e" . image-dired-thumbnail-display-external)
("k" . image-dired-display-previous)
("l" . image-dired-display-next)
:map dired-mode-map
("I" . dired-extras-image-dired-current-directory)))
```
`nerd-icons-dired` {#nerd-icons-dired}
_[nerd-icons-dired](https://github.com/rainstormstudio/nerd-icons-dired) adds Dired support to nerd-icons._
```emacs-lisp
(use-package nerd-icons-dired
:after dired
:hook
(dired-mode-hook . nerd-icons-dired-mode))
```
`wdired` {#wdired}
_[wdired](https://github.com/emacs-mirror/emacs/blob/master/lisp/wdired.el) makes Dired buffers editable for renaming files and changing permissions._
```emacs-lisp
(use-feature wdired
:custom
(wdired-allow-to-change-permissions t)
:bind
(:map wdired-mode-map
("s-c" . wdired-finish-edit)
("<return>" . wdired-finish-edit)))
```
`gnus-dired` {#gnus-dired}
_gnus-dired provides utility functions for intersections of `gnus` and `dired`._
I use `mu4e` (see below) rather than `gnus` to handle email. However, one of this feature's functions also works with `mu4e`, allowing me to attach a file to an email directly from a Dired buffer.
```emacs-lisp
(use-feature gnus-dired
:after dired
:custom
;; enable `mu4e' attachments from `dired'
;; djcbsoftware.nl/code/mu/mu4e/Dired.html
(gnus-dired-mail-mode 'mu4e-user-agent)
:hook
(dired-mode-hook . turn-on-gnus-dired-mode)
:bind
(:map dired-mode-map
("s-a" . gnus-dired-attach)))
```
`dired-hacks` {#dired-hacks}
_[dired-hacks](https://github.com/Fuco1/dired-hacks) is a collection of useful dired additions._
```emacs-lisp
(use-package dired-hacks
:ensure (:host github
:repo "Fuco1/dired-hacks")
:after dired
:init
;; re-run `dired-omit-mode' after toggling a subtree, so that files in the
;; newly revealed directory are omitted too
(advice-add 'dired-subtree-toggle :after (lambda () (dired-omit-mode) (dired-omit-mode)))
(advice-add 'dired-subtree-cycle :after (lambda () (dired-omit-mode) (dired-omit-mode)))
:bind
(:map dired-mode-map
("C-w" . dired-narrow-regexp)
("<tab>" . dired-subtree-toggle)
("<backtab>" . dired-subtree-cycle)))
```
`dired-quick-sort` {#dired-quick-sort}
_[dired-quick-sort](https://gitlab.com/xuhdev/dired-quick-sort) provides persistent quick sorting of Dired buffers in various ways._
```emacs-lisp
(use-package dired-quick-sort
:after dired
:bind
(:map dired-mode-map
("T" . dired-quick-sort-transient)))
```
`peep-dired` {#peep-dired}
_[peep-dired](https://github.com/asok/peep-dired) supports browsing file contents in another window while navigating a directory in Dired._
```emacs-lisp
(use-package peep-dired
:after dired
:bind
(:map dired-mode-map
("F" . peep-dired)))
```
`minibuffer` {#minibuffer}
_[minibuffer](https://github.com/emacs-mirror/emacs/blob/master/lisp/minibuffer.el) provides minibuffer completion and input facilities._
```emacs-lisp
(use-feature minibuffer
:custom
(enable-recursive-minibuffers t)
(resize-mini-windows t)
(completion-cycle-threshold 3) ; TAB cycle if there are only few candidates
(minibuffer-default-prompt-format " [%s]")
(minibuffer-electric-default-mode t)
:bind
(:map minibuffer-mode-map
("M-k" . previous-history-element)
("M-l" . next-history-element)
("C-e" . search-query-replace)
("TAB" . yas-maybe-expand)
("s-i" . org-roam-node-insert)
("M-n" . nil)
("M-p" . nil)))
```
`ibuffer` {#ibuffer}
_[ibuffer](https://github.com/emacs-mirror/emacs/blob/master/lisp/ibuffer.el) provides an advanced, interactive buffer list._
```emacs-lisp
(use-feature ibuffer
:bind
(:map ibuffer-mode-map
("k" . ibuffer-do-delete)))
```
`prot-scratch` {#prot-scratch}
_[prot-scratch](https://github.com/protesilaos/dotfiles/blob/master/emacs/.emacs.d/prot-lisp/prot-scratch.el) supports scratch buffers for an editable major mode of choice._
```emacs-lisp
(use-package prot-scratch
:ensure (:host github
:repo "protesilaos/dotfiles"
:local-repo "prot-scratch"
:main "emacs/.emacs.d/prot-lisp/prot-scratch.el"
:build (:not elpaca-check-version)
:files ("emacs/.emacs.d/prot-lisp/prot-scratch.el"))
:custom
(prot-scratch-default-mode 'org-mode)
:bind
("C-n" . prot-scratch-buffer))
```
`persistent-scratch` {#persistent-scratch}
_[persistent-scratch](https://github.com/Fanael/persistent-scratch) preserves the scratch buffer across Emacs sessions._
I use this package in combination with `prot-scratch` (see above) to persist scratch buffers in different modes. This way, I am able to open a scratch buffer in _any_ mode for temporary notes, without running the risk of losing them.
```emacs-lisp
(use-package persistent-scratch
:custom
(persistent-scratch-autosave-interval 10)
(persistent-scratch-backup-directory
(no-littering-expand-var-file-name "auto-save/scratch-buffers/"))
:config
(persistent-scratch-setup-default))
```
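Note that, by default, `persistent-scratch` only persists buffers named `*scratch*`; which buffers are saved is controlled by `persistent-scratch-scratch-buffer-p-function`. A minimal sketch of how the predicate could be widened to cover mode-specific scratch buffers as well (the name-matching regexp is an assumption about how such buffers are named, not part of the config above):
```emacs-lisp
;; Sketch: persist any buffer whose name contains "scratch", not just
;; `*scratch*' (the default predicate only matches the latter). The
;; regexp is an assumption about the scratch buffers' naming scheme.
(setopt persistent-scratch-scratch-buffer-p-function
(lambda ()
(string-match-p "scratch" (buffer-name))))
```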
`executable` {#executable}
_[executable](https://github.com/emacs-mirror/emacs/blob/master/lisp/progmodes/executable.el) provides base functionality for executable interpreter scripts._
```emacs-lisp
(use-feature executable
:hook
;; masteringemacs.org/article/script-files-executable-automatically
(after-save-hook . executable-make-buffer-file-executable-if-script-p))
```
`uniquify` {#uniquify}
_[uniquify](https://github.com/emacs-mirror/emacs/blob/master/lisp/uniquify.el) provides unique buffer names._
```emacs-lisp
(use-feature uniquify
:custom
(uniquify-buffer-name-style 'forward))
```
`reveal-in-osx-finder` {#reveal-in-osx-finder}
_[reveal-in-osx-finder](https://github.com/kaz-yos/reveal-in-osx-finder) lets you open the file at point or the current file-visiting buffer in OS X Finder._
```emacs-lisp
(use-package reveal-in-osx-finder
:after dired
:demand t
:bind (:map dired-mode-map
("/" . reveal-in-osx-finder)))
```
`tramp` {#tramp}
_[tramp](https://www.gnu.org/software/tramp/) is a remote file editing package for Emacs._
```emacs-lisp
(use-feature tramp
:after dired-x
:custom
;; Disable version control on tramp buffers to avoid freezes.
(vc-ignore-dir-regexp
(format "\\(%s\\)\\|\\(%s\\)"
vc-ignore-dir-regexp
tramp-file-name-regexp))
;; Don't clean up recentf tramp buffers.
(recentf-auto-cleanup 'never)
;; This is supposedly [[https://www.emacswiki.org/emacs/TrampMode][faster than the default]], `scp'.
(tramp-default-method "sshx")
;; Store TRAMP auto-save files locally.
(tramp-auto-save-directory paths-dir-emacs-var)
;; A more representative name for this file.
(tramp-persistency-file-name (file-name-concat tramp-auto-save-directory "tramp-connection-history"))
;; Cache SSH passwords during the whole Emacs session.
(password-cache-expiry nil)
;; emacs.stackexchange.com/a/37855/32089
(remote-file-name-inhibit-cache nil)
:config
;; Reuse SSH connections. Taken from the TRAMP FAQ.
(customize-set-variable 'tramp-ssh-controlmaster-options
(concat
"-o ControlPath=/tmp/ssh-tramp-%%r@%%h:%%p "
"-o ControlMaster=auto -o ControlPersist=yes"))
;; This will put in effect PATH changes in the remote ~/.profile.
(add-to-list 'tramp-remote-path 'tramp-own-remote-path)
(advice-add 'projectile-project-root
:around (lambda (orig-fun &rest args)
"Ignore remote files."
(unless (file-remote-p default-directory)
(apply orig-fun args))))
(add-to-list 'tramp-connection-properties
(list (regexp-quote "/ssh:fede@tlon.team:")
"direct-async-process" t)))
```
`pandoc-mode` {#pandoc-mode}
_[pandoc-mode](https://github.com/joostkremers/pandoc-mode) is a minor mode for interacting with Pandoc._
```emacs-lisp
(use-package pandoc-mode
:defer t)
```
track-changes {#track-changes}
_[track-changes](https://elpa.gnu.org/packages/track-changes.html) provides an API to react to buffer modifications._
I load this explicitly because the version bundled with Emacs 30.2 (1.2) has a known bug (debbugs#73041), where `track-changes--before-beg`/`track-changes--before-end` can become inconsistent with the buffer size, triggering a `cl-assertion-failed` error on buffer modifications. The GNU ELPA version fixes this bug.
```emacs-lisp
(use-package track-changes
;; The default GNU ELPA recipe clones the full emacs-mirror/emacs repo
;; and uses a :files spec matching that layout. Use emacs-straight's
;; lightweight mirror instead, with a flat :files glob.
:ensure (:repo "https://github.com/emacs-straight/track-changes.git"
:files ("*" (:exclude ".git")))
:demand t)
```
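For reference, the API is used roughly as follows (a sketch, not part of my configuration; `my-tracker` is a hypothetical name): a tracker is registered once with a signal function, and the accumulated changes are consumed with `track-changes-fetch`.
```emacs-lisp
;; Sketch of the `track-changes' API: register a tracker with a signal
;; function, then call `track-changes-fetch' to consume the accumulated
;; changes (BEG and END delimit the modified region; BEFORE is the text
;; it replaced).
(defvar my-tracker ; hypothetical name
(track-changes-register
(lambda (id &optional _distance)
(track-changes-fetch
id (lambda (beg end before)
(message "Buffer changed at %d-%d (was %S)" beg end before))))))
```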
windows &amp; frames {#windows-and-frames}
`window` {#window}
_[window](https://github.com/emacs-mirror/emacs/blob/master/lisp/window.el) provides commands for displaying buffers, scrolling, and managing window layout._
```emacs-lisp
(use-feature window
:init
(with-eval-after-load 'ebib
(bind-keys :map ebib-index-mode-map
("H-q" . bury-buffer)
:map ebib-entry-mode-map
("H-q" . bury-buffer)))
(with-eval-after-load 'simple
(bind-keys :map messages-buffer-mode-map
("H-q" . bury-buffer)))
(with-eval-after-load 'telega
(bind-keys :map telega-root-mode-map
("H-q" . bury-buffer)))
(with-eval-after-load 'elfeed
(bind-keys :map elfeed-show-mode-map
("y" . scroll-down-command)
("h" . scroll-up-command)))
(with-eval-after-load 'helpful
(bind-keys :map helpful-mode-map
("y" . scroll-down-command)
("h" . scroll-up-command)))
(with-eval-after-load 'mu4e
(bind-keys :map mu4e-view-mode-map
("y" . scroll-down-command)
("h" . scroll-up-command)))
(with-eval-after-load 'telega
(bind-keys :map telega-msg-button-map
("y" . scroll-down-command)
("h" . scroll-up-command)))
:custom
(split-height-threshold nil)
;; move point to top of buffer if `scroll-down-command' invoked when screen can scroll no further
(scroll-error-top-bottom t)
(split-width-threshold 200)
:config
;; add the `*ocr-pdf*' buffer to the list of buffers not to be displayed,
;; so that the OCR process runs in the background
(push '("*ocr-pdf*" display-buffer-no-window) display-buffer-alist)
;; The following prevents Emacs from splitting windows indefinitely when the monitor config changes
;; stackoverflow.com/questions/23207958/how-to-prevent-emacs-dired-from-splitting-frame-into-more-than-two-windows
(add-to-list 'display-buffer-alist `(,shell-command-buffer-name-async display-buffer-no-window))
(init-override-code
:window-split
'((add-hook 'elpaca-after-init-hook #'window-extras-split-if-unsplit)))
:bind
(("H-w" . delete-window)
("A-C-s-y" . scroll-down-command)
("A-C-s-h" . scroll-up-command)
("A-C-s-g" . scroll-other-window)
("A-C-s-t" . scroll-other-window-down)))
```
`window-extras` {#window-extras}
_[window-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/window-extras.el) collects my extensions for `window`._
```emacs-lisp
(use-personal-package window-extras
:bind
(("C-H-0" . window-extras-switch-to-last-window)
("A-C-H-0" . window-extras-switch-to-minibuffer-window)
("M-," . window-extras-buffer-move-left)
("M-." . window-extras-buffer-move-right)
("A-M--" . window-extras-buffer-swap) ; `emacs-mac'
("A-M-–" . window-extras-buffer-swap))) ; `emacs-plus'
```
`frame` {#frame}
_[frame](https://github.com/emacs-mirror/emacs/blob/master/lisp/frame.el) provides multi-frame management independent of window systems._
```emacs-lisp
(use-feature frame
:demand t
:custom
(window-divider-default-right-width 1)
:config
(blink-cursor-mode)
(window-divider-mode)
:bind
(("H-M-<tab>" . other-frame) ; M-S-TAB
("M-N" . make-frame)
("M-W" . delete-frame)))
```
`frame-extras` {#frame-extras}
_[frame-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/frame-extras.el) collects my extensions for `frame`._
```emacs-lisp
(use-personal-package frame-extras
:hook
(elpaca-after-init-hook . frame-extras-maximize-frame)
(spacious-padding-mode-hook . frame-extras-restore-window-divider)
:bind
;; the key bindings below are triggered via Karabiner
(("C-H-I" . frame-extras-maximize-frame)
("C-H-U" . frame-extras-left-half)
("C-H-P" . frame-extras-right-half)))
```
`posframe` {#posframe}
_[posframe](https://github.com/tumashu/posframe) supports displaying small popup frames._
```emacs-lisp
(use-package posframe)
```
`winum` {#winum}
_[winum-mode](https://github.com/deb0ch/emacs-winum) supports navigation of windows and frames using numbers._
```emacs-lisp
(use-package winum
:custom
(winum-scope 'frame-local)
:config
(winum-mode)
:bind
(("<C-m>" . winum-select-window-1)
("C-," . winum-select-window-2)
("C-." . winum-select-window-3)
("C-/" . winum-select-window-4)))
```
`winner` {#winner}
_[winner-mode](https://www.gnu.org/software/emacs/manual/html_node/emacs/Window-Convenience.html) is a global minor mode that records the changes in the window configuration (i.e., how the frames are partitioned into windows), so that you can undo them._
```emacs-lisp
(use-feature winner
:config
(remove-hook 'minibuffer-setup-hook 'winner-save-unconditionally)
(winner-mode)
:bind
("H-W" . winner-undo))
```
`popper` {#popper}
_[popper](https://github.com/karthink/popper) is a minor-mode to summon and dismiss buffers easily._
```emacs-lisp
(use-package popper
:init
(setopt popper-reference-buffers
'("\\*Warnings\\*"
"Output\\*$"
help-mode
helpful-mode
compilation-mode))
(popper-mode)
(popper-echo-mode)
:custom
(popper-display-control 'user) ; assumes buffer-specific behavior customized via `display-buffer-alist'
(popper-echo-dispatch-keys '(?a ?s ?d ?f ?j ?l ?q ?w ?e ?r ?u ?i ?o ?p ?z ?x ?c ?v ?m ?, ?. ?/ ? ))
(popper-window-height (lambda (window)
"Set WINDOW to a size up to 33% of the frame height."
(fit-window-to-buffer
window
(floor (frame-height) 3))))
:bind
(("C-o" . popper-toggle)
("A-C-o" . popper-toggle-type)
("C-H-o" . popper-cycle)))
```
`avy` {#avy}
_[avy](https://github.com/abo-abo/avy) lets you jump to any visible text using a char-based decision tree._
```emacs-lisp
(use-package avy
:init
(with-eval-after-load 'ebib
(bind-keys :map ebib-entry-mode-map
("f" . avy-goto-line)))
(with-eval-after-load 'isearch
(bind-keys :map isearch-mode-map
("M-f" . avy-isearch)))
:custom
(avy-case-fold-search nil)
(avy-timeout-seconds 0.2)
(avy-all-windows nil)
(avy-keys (append '(?k) popper-echo-dispatch-keys))
:config
(setf (alist-get ?r avy-dispatch-alist) 'avy-extras-action-mark-to-char)
:bind
(("C-H-s-m" . avy-goto-line-above)
("C-H-s-." . avy-goto-line-below)
("C-H-s-k" . avy-goto-word-1-above)
("C-H-s-l" . avy-goto-word-1-below)))
```
`avy-extras` {#avy-extras}
_[avy-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/avy-extras.el) collects my extensions for `avy`._
```emacs-lisp
(use-personal-package avy-extras
:bind
(("C-H-s-u" . avy-extras-goto-word-in-line-behind)
("C-H-s-p" . avy-extras-goto-word-in-line-ahead)
("C-H-s-," . avy-extras-goto-end-of-line-above)
("C-H-s-/" . avy-extras-goto-end-of-line-below)
("C-H-s-j" . avy-extras-goto-char-backward)
("C-H-s-;" . avy-extras-goto-char-forward)))
```
`writeroom-mode` {#writeroom-mode}
_[writeroom-mode](https://github.com/joostkremers/writeroom-mode) provides distraction-free writing for Emacs._
```emacs-lisp
(use-package writeroom-mode
:custom
(writeroom-global-effects '(writeroom-set-fullscreen
writeroom-set-alpha
writeroom-set-menu-bar-lines
writeroom-set-tool-bar-lines
writeroom-set-vertical-scroll-bars
writeroom-set-bottom-divider-width
(lambda (arg) (tab-bar-mode (* -1 arg)))))
(writeroom-restore-window-config t) ; upon leaving `writeroom mode', restore pre-existing number of windows
(writeroom-major-modes '(org-mode
elfeed-search-mode
elfeed-show-mode
eww-mode
eww-buffers-mode)) ; major modes activated in global-writeroom-mode
(writeroom-fullscreen-effect 'maximized) ; disables annoying fullscreen transition effect on macos
(writeroom-maximize-window t)
:config
(advice-add 'writeroom-mode
:before (lambda (&rest _)
"Set `writeroom-width' to the width of the window in which it is invoked."
(setopt writeroom-width (window-total-width))))
:bind
("H-u" . writeroom-mode))
```
logos {#logos}
_[logos](https://protesilaos.com/emacs/logos) provides a simple focus mode for reading, writing, and presentation._
```emacs-lisp
(use-package logos
:demand t
:config
;; Treat outline headings (org *, elisp ;;;, markdown #) as pages
(setq logos-outlines-are-pages t)
;; Aesthetic tweaks for `logos-focus-mode' (all buffer-local)
(setq-default logos-hide-cursor nil
logos-hide-mode-line t
logos-hide-header-line t
logos-hide-buffer-boundaries t
logos-hide-fringe t
logos-variable-pitch nil
logos-buffer-read-only nil
logos-scroll-lock nil
logos-olivetti nil)
;; Recenter at top on page motion so each "slide" starts at the top
(defun ps/logos-recenter-top ()
"Use `recenter' to reposition the view at the top."
(unless (derived-mode-p 'prog-mode)
(recenter 0)))
(add-hook 'logos-page-motion-hook #'ps/logos-recenter-top)
:bind
(([remap narrow-to-region] . logos-narrow-dwim)
([remap forward-page] . logos-forward-page-dwim)
([remap backward-page] . logos-backward-page-dwim)
("<f9>" . logos-focus-mode)))
```
`ace-link` {#ace-link}
_[ace-link](https://github.com/abo-abo/ace-link) lets you quickly follow links in Emacs, Vimium-style._
```emacs-lisp
(use-package ace-link
:defer t)
```
`ace-link-extras` {#ace-link-extras}
_[ace-link-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/ace-link-extras.el) collects my extensions for `ace-link`._
```emacs-lisp
(use-personal-package ace-link-extras
:after ace-link
:config
(dolist (mode '(slack-message-buffer-mode
slack-thread-message-buffer-mode
slack-activity-feed-buffer-mode))
(push (list 'ace-link-extras-slack mode) ace-link-major-mode-actions)))
```
date &amp; time {#date-and-time}
`calendar` {#calendar}
_[calendar](https://github.com/emacs-mirror/emacs/blob/master/lisp/calendar/calendar.el) provides a collection of calendar-related functions._
```emacs-lisp
(use-feature calendar
:custom
(calendar-week-start-day 1) ; week starts on Monday
(calendar-date-style 'iso) ; the default is 'american
(calendar-time-display-form
'(24-hours ":" minutes
(when time-zone
(concat " (" time-zone ")"))))
(calendar-mark-holidays-flag nil)
(calendar-time-zone-style 'numeric)
(holiday-bahai-holidays nil)
:bind
(("C-d" . calendar)
("s-=" . "C-u A-s-=")
:map calendar-mode-map
("H-m" . calendar-set-mark)
("A-C-s-u" . calendar-backward-day)
("A-C-s-i" . calendar-backward-week)
("A-C-s-o" . calendar-forward-week)
("A-C-s-p" . calendar-forward-day)
("A-C-s-m" . calendar-backward-month)
("A-C-s-," . calendar-backward-year)
("A-C-s-." . calendar-forward-year)
("A-C-s-/" . calendar-forward-month)
("C-f" . nil)
("C-b" . nil)
("C-n" . nil)
("C-p" . nil)
("=" . calendar-count-days-region)))
```
`calendar-extras` {#calendar-extras}
_[calendar-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/calendar-extras.el) collects my extensions for `calendar`._
```emacs-lisp
(use-personal-package calendar-extras
:after org-agenda
:custom
(calendar-extras-location-name "Buenos Aires")
(calendar-extras-use-geolocation t))
```
`holidays` {#holidays}
_[holidays](https://github.com/emacs-mirror/emacs/blob/e819413e24d81875abaf81c281115e695ad5cc28/lisp/calendar/holidays.el#L98) provides holiday functions for `calendar`._
```emacs-lisp
(use-feature holidays
:after org-agenda
:config
(dolist (holiday '((holiday-float 6 0 3 "Father's Day")
(holiday-float 5 0 2 "Mother's Day")))
(delete holiday holiday-general-holidays)))
```
`institution-calendar` {#institution-calendar}
_[institution-calendar](https://github.com/protesilaos/institution-calendar) augments the `calendar` buffer to include Oxford/Calendar term indicators._
```emacs-lisp
(use-package institution-calendar
:ensure (:host github
:repo "protesilaos/institution-calendar")
:defer t
:config
(institution-calendar-mode))
```
`org-gcal` {#org-gcal}
_[org-gcal](https://github.com/kidd/org-gcal.el) integrates `org-mode` with Google Calendar._
(That's the actively maintained fork; the [official repository](https://github.com/myuhe/org-gcal.el/issues/124#issuecomment-642859466) is no longer maintained.)
```emacs-lisp
(use-package org-gcal
:ensure (:host github
:repo "benthamite/org-gcal.el"
:branch "fix/strip-html-descriptions"
:build (:not elpaca-check-version)) ; https://github.com/kidd/org-gcal.el/pull/276
:after auth-source-pass org-agenda
:custom
(org-gcal-client-id (auth-source-pass-get "host" "auth-sources/org-gcal"))
(org-gcal-client-secret (auth-source-pass-get 'secret "auth-sources/org-gcal"))
(org-gcal-fetch-file-alist `((,(getenv "PERSONAL_GMAIL") . ,paths-file-calendar)
(,(getenv "EPOCH_EMAIL") . ,paths-file-calendar)))
(org-gcal-recurring-events-mode 'top-level)
(org-gcal-remove-api-cancelled-events nil) ; never remove cancelled events
(org-gcal-notify-p nil)
(org-gcal-up-days 1)
(org-gcal-down-days 7)
(org-gcal-auto-archive nil)
:config
;; see the relevant section in this config file for more details on how to set
;; up `org-gcal' with asymmetric encryption
(require 'plstore))
```
`org-gcal-extras` {#org-gcal-extras}
_[org-gcal-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-gcal-extras.el) collects my extensions for `org-gcal`._
```emacs-lisp
(use-personal-package org-gcal-extras
:after org-gcal
:demand t)
```
`calfw` {#calfw}
_[calfw](https://github.com/haji-ali/emacs-calfw) is a calendar framework for Emacs._
The original package is no longer maintained. A [fork](https://github.com/haji-ali/emacs-calfw) by Abdul-Lateef Haji-Ali added a few improvements, but that fork has itself ceased to be maintained, so I am now using my own fork.
```emacs-lisp
(use-package calfw
;; :ensure (:host github
;; :repo "benthamite/emacs-calfw")
:after org-agenda)
```
`calfw-org` {#calfw-org}
_[calfw-org](https://github.com/benthamite/emacs-calfw/blob/master/calfw-org.el) displays org-agenda items in the calfw buffer._
```emacs-lisp
(use-package calfw-org
:after calfw org)
```
`calfw-blocks` {#calfw-blocks}
_[calfw-blocks](https://github.com/benthamite/calfw-blocks) provides visual enhancements for calfw._
The original package no longer appears to be maintained, so I have created my own fork.
```emacs-lisp
(use-package calfw-blocks
:ensure (calfw-blocks
:host github
:repo "benthamite/calfw-blocks")
:after calfw)
```
`time` {#time}
_[time](https://github.com/emacs-mirror/emacs/blob/e819413e24d81875abaf81c281115e695ad5cc28/lisp/time.el#L2) provides facilities to display the current date and time, and a new-mail indicator mode line._
```emacs-lisp
(use-feature time
:demand t
:custom
(zoneinfo-style-world-list '(("America/Buenos_Aires" "Buenos Aires")
("Europe/London" "London")
("Europe/Madrid" "Madrid")
("America/New_York" "New York")
("America/Los_Angeles" "San Francisco")
("Europe/Stockholm" "Stockholm")))
(world-clock-list t)
(world-clock-time-format "%R %z (%Z) %A %d %B")
(world-clock-buffer-name "*world-clock*")
(world-clock-timer-enable t)
(world-clock-timer-second 60)
(display-time-interval 1)
(display-time-format "%a %e %b %T %z")
(display-time-default-load-average nil)
:config
(display-time-mode)
:bind
("M-A-t" . world-clock))
```
`timer-list` {#timer-list}
_[timer-list](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/timer-list.el) lists active timers in a tabulated buffer._
```emacs-lisp
(use-feature timer-list
:config
;; disable warning
(put 'list-timers 'disabled nil))
```
`tmr` {#tmr}
_[tmr](https://protesilaos.com/emacs/tmr) sets timers using a convenient notation._
```emacs-lisp
(use-package tmr
:defer t
:config
(when (eq system-type 'darwin)
(setopt tmr-sound-file "/System/Library/Sounds/Blow.aiff")))
```
`display-wttr` {#display-wttr}
_[display-wttr](https://git.sr.ht/~josegpt/display-wttr) displays weather information in the mode line._
```emacs-lisp
(use-package display-wttr
:disabled ; triggering lots of errors
:after calendar-extras
:custom
(display-wttr-interval (* 15 60))
(display-wttr-locations `(,calendar-extras-location-name))
:config
(display-wttr-mode))
```
history {#history}
`savehist` {#savehist}
_[savehist](https://github.com/emacs-mirror/emacs/blob/e819413e24d81875abaf81c281115e695ad5cc28/lisp/savehist.el) makes Emacs remember completion history across sessions._
```emacs-lisp
(use-feature savehist
:custom
(history-length t) ; unlimited history
(savehist-save-minibuffer-history t)
:config
(savehist-mode))
```
`simple` {#simple}
_[simple](https://github.com/emacs-mirror/emacs/blob/master/lisp/simple.el) registers additional variables for persistence across sessions via savehist._
```emacs-lisp
(use-feature simple
:defer t
:config
(with-eval-after-load 'savehist
(dolist (var '(command-history
extended-command-history
kill-ring
mark-ring
shell-command-history
read-expression-history))
(add-to-list 'savehist-additional-variables var))))
```
`saveplace` {#saveplace}
_[saveplace](https://github.com/emacs-mirror/emacs/blob/e819413e24d81875abaf81c281115e695ad5cc28/lisp/saveplace.el) makes Emacs remember point position in file across sessions._
```emacs-lisp
(use-feature saveplace
:config
(save-place-mode))
```
`session` {#session}
_[session](https://github.com/emacsorphanage/session) lets you use variables, registers and buffer places across sessions._
```emacs-lisp
(use-package session
:custom
(session-globals-include '((kill-ring 100)
(session-file-alist 100 t)
(file-name-history 100)
search-ring regexp-search-ring))
:hook
(elpaca-after-init-hook . session-initialize))
```
`recentf` {#recentf}
_[recentf](https://github.com/emacs-mirror/emacs/blob/e819413e24d81875abaf81c281115e695ad5cc28/lisp/recentf.el) makes Emacs remember the most recently visited files._
```emacs-lisp
(use-feature recentf
:custom
(recentf-max-saved-items 100)
:config
;; github.com/emacscollective/no-littering#suggested-settings
(add-to-list 'recentf-exclude no-littering-var-directory)
(add-to-list 'recentf-exclude no-littering-etc-directory)
:hook find-file-hook)
```
search &amp; replace {#search-and-replace}
`elgrep` {#elgrep}
_[elgrep](https://github.com/TobiasZawada/elgrep) is a grep-like search tool implemented entirely in Emacs Lisp, with no external dependencies._
```emacs-lisp
(use-package elgrep
:bind
(:map elgrep-mode-map
("r" . elgrep-edit-mode)
("s-c" . elgrep-save)))
```
`isearch` {#isearch}
_[isearch](https://github.com/emacs-mirror/emacs/blob/master/lisp/isearch.el) provides incremental search._
```emacs-lisp
(use-feature isearch
:custom
(search-default-mode #'char-fold-to-regexp)
(isearch-yank-on-move t)
(isearch-lazy-count t)
(lazy-count-prefix-format nil)
(lazy-count-suffix-format " (%s/%s)")
(isearch-allow-scroll 'unlimited)
(search-upper-case t)
(search-exit-option t) ; `t' is the default, but some alternative value may be more sensible
:config
(with-eval-after-load 'savehist
(dolist (var '(regexp-search-ring search-ring))
(add-to-list 'savehist-additional-variables var)))
:hook
(isearch-mode-end-hook . recenter-top-bottom)
:bind
(:map isearch-mode-map
("C-H-M-s" . isearch-delete-char)
("C-H-M-d" . "C-- C-H-M-s") ; delete forward char
("C-g" . isearch-abort) ; "quit once"
("C-H-g" . isearch-exit) ; "quit twice"
("C-. " . isearch-toggle-char-fold)
("C-," . isearch-forward-symbol-at-point)
("C-." . isearch-forward-thing-at-point)
("C-/" . isearch-complete)
("H-m" . isearch-toggle-lax-whitespace)
("C-a" . isearch-toggle-regexp)
("C-b" . isearch-beginning-of-buffer)
("C-d" . isearch-toggle-word)
("C-f" . isearch-highlight-lines-matching-regexp)
("C-i" . isearch-toggle-invisible)
("C-l" . isearch-yank-line)
("C-m" . isearch-toggle-symbol)
("C-n" . isearch-end-of-buffer)
("C-o" . isearch-occur)
("C-p" . isearch-highlight-regexp)
("C-v" . isearch-yank-kill)
("C-y" . isearch-forward-symbol-at-point)
("M-k" . isearch-ring-retreat)
("M-l" . isearch-ring-advance)
("C-e" . isearch-query-replace)))
```
To check: [Bridging Islands in Emacs: re-builder and query-replace-regexp | Karthinks](https://karthinks.com/software/bridging-islands-in-emacs-1/)
`isearch-extras` {#isearch-extras}
_[isearch-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/isearch-extras.el) collects my extensions for `isearch`._
```emacs-lisp
(use-personal-package isearch-extras
:config
(advice-add 'isearch-mode :around #'isearch-extras-use-selection)
:bind
(:map isearch-mode-map
("C-<return>" . isearch-extras-exit-other-end)
("H-c" . isearch-extras-copy-match)
("C-p" . isearch-extras-project-search)
("C-l" . isearch-extras-consult-line)
("C-H-v" . isearch-extras-yank-kill-literally)))
```
`replace` {#replace}
_[replace](https://github.com/emacs-mirror/emacs/blob/master/lisp/replace.el) provides search-and-replace commands._
```emacs-lisp
(use-feature replace
:custom
;; emacs.stackexchange.com/a/12318/32089
(query-replace-from-history-variable 'regexp-search-ring)
(case-replace nil)
:bind
(("C-H-a" . query-replace)
("C-H-s" . query-replace-regexp)))
```
`substitute` {#substitute}
_[substitute](https://git.sr.ht/~protesilaos/substitute) efficiently replaces targets in the buffer or context._
```emacs-lisp
(use-package substitute
:ensure (:host github
:repo "protesilaos/substitute")
:hook
(substitute-post-replace-functions . substitute-report-operation)
:bind
(("A-H-b" . substitute-target-in-buffer)
:map prog-mode-map
("A-H-d" . substitute-target-in-defun)))
```
`imenu` {#imenu}
_[imenu](https://github.com/emacs-mirror/emacs/blob/master/lisp/imenu.el) is a framework for mode-specific buffer indexes._
```emacs-lisp
(use-feature imenu
:defer t
:custom
(org-imenu-depth 3))
```
`pcre2el` {#pcre2el}
_[pcre2el](https://github.com/joddie/pcre2el) supports conversion between PCRE, Emacs and rx regexp syntax._
```emacs-lisp
(use-package pcre2el
:defer t)
```
`wgrep` {#wgrep}
_[wgrep](https://github.com/mhayashi1120/Emacs-wgrep) lets you create a writable grep buffer and apply the changes to files._
```emacs-lisp
(use-package wgrep
:custom
(wgrep-auto-save-buffer t)
(wgrep-enable-key "r")
:bind
(:map wgrep-mode-map
("s-c" . wgrep-finish-edit)))
```
minibuffer completion {#minibuffer-completion}
| package | what it does |
|------------|-----------------------------------|
| vertico | minibuffer completion UI |
| consult | minibuffer completion backend |
| orderless | minibuffer completion styles |
| marginalia | minibuffer completion annotations |
| embark | minibuffer completion actions |
For an introduction to minibuffer completion, I recommend [this video](https://www.youtube.com/watch?v=d3aaxOqwHhI) by Protesilaos Stavrou. For a comprehensive overview of both minibuffer completion and completion at point, I recommend [this video](https://www.youtube.com/watch?v=fnE0lXoe7Y0) by Andrew Tropin.
`bindings` {#bindings}
_[bindings](https://github.com/emacs-mirror/emacs/blob/master/lisp/bindings.el) defines standard key bindings and some variables._
```emacs-lisp
(use-feature bindings
:bind
("<C-i>" . complete-symbol))
```
`vertico` {#vertico}
_[vertico](https://github.com/minad/vertico) is a vertical completion UI based on the default completion system._
```emacs-lisp
(use-package vertico
:ensure (:files (:defaults "extensions/*")
:includes (vertico-indexed
vertico-flat
vertico-grid
vertico-mouse
vertico-quick
vertico-buffer
vertico-repeat
vertico-reverse
vertico-directory
vertico-multiform
vertico-unobtrusive))
:init
(vertico-mode)
:custom
(vertico-multiform-commands
'((consult-line buffer)
(consult-imenu buffer)
(consult-grep buffer)
(isearch-extras-consult-line buffer)))
(vertico-multiform-categories
'((grid)))
(vertico-cycle t)
(vertico-count 16)
:config
(vertico-multiform-mode)
:hook
;; youtu.be/L_4pLN0gXGI?t=779
(rfn-eshadow-update-overlay-hook . vertico-directory-tidy)
:bind
(:map vertico-map
("<C-i>" . vertico-exit)
("M-f" . vertico-quick-exit)
("C-k" . vertico-previous-group)
("C-l" . vertico-next-group)
("C-H-M-w" . vertico-directory-up)))
```
`embark` {#embark}
_[embark](https://github.com/oantolin/embark) provides contextually relevant actions in completion menus and in normal buffers._
```emacs-lisp
(use-package embark
:custom
(embark-confirm-act-all nil)
:config
(defvar-keymap embark-yasnippet-completion-actions
:doc "Keymap for actions on yasnippet completions."
:parent embark-general-map
"d" #'consult-yasnippet-visit-snippet-file)
(add-to-list 'embark-keymap-alist '(yasnippet . embark-yasnippet-completion-actions))
(keymap-set embark-general-map "?" #'gptel-quick)
(keymap-set embark-defun-map "R" #'gptel-extras-rewrite-defun)
:bind
(("C-;" . embark-act)
("C-H-;" . embark-act-all)
("C-h B" . embark-bindings)
:map embark-general-map
("DEL" . nil)
("D" . delete-region)
("f" . helpful-symbol)
:map embark-file-map
("D" . delete-region)
:map embark-general-map
("I" . embark-insert)
:map embark-identifier-map
("i" . citar-extras-open-in-ebib)
:map embark-file-map
("H-c" . files-extras-copy-contents)))
```
`consult` {#consult}
_[consult](https://github.com/minad/consult) provides practical commands based on the Emacs completion function `completing-read`._
```emacs-lisp
(use-package consult
:init
(with-eval-after-load 'helpful
(bind-keys :map helpful-mode-map
("s-j" . consult-outline)))
(with-eval-after-load 'markdown-mode
(bind-keys :map markdown-mode-map
("s-j" . consult-outline)))
(with-eval-after-load 'gfm-mode
(bind-keys :map gfm-mode-map
("s-j" . consult-outline)))
(with-eval-after-load 'outline
(bind-keys :map outline-mode-map
("s-j" . consult-outline)))
:custom
;; we call this wrapper to silence the annoying two lines of debug info that
;; `mdfind' outputs, which show briefly in the echo area and pollute the
;; `consult' search field. the file is in the `bin' directory of this repo.
(consult-locate-args "mdfind-wrapper")
(consult-narrow-key "<")
(consult-widen-key ">")
(consult-grep-max-columns nil)
:config
(setopt consult-ripgrep-args (concat consult-ripgrep-args " --hidden")) ; include hidden files
:bind
(("C-H-l" . consult-line)
("C-f" . consult-find)
("s-j" . consult-imenu)
("H-b" . consult-buffer)
("H-B" . consult-project-buffer)
("A-H-i" . consult-info)
("H-R" . consult-history)
("H-V" . consult-yank-pop)))
```
`consult-extras` {#consult-extras}
_[consult-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/consult-extras.el) collects my extensions for `consult`._
```emacs-lisp
(use-personal-package consult-extras
:bind
(("H-F" . consult-extras-locate-file-current)
("H-k" . consult-extras-locate-current)
("H-p" . consult-extras-ripgrep-current)))
```
`consult-dir` {#consult-dir}
_[consult-dir](https://github.com/karthink/consult-dir) enables insertion of paths into the minibuffer prompt._
```emacs-lisp
(use-package consult-dir
:after consult
:custom
(consult-dir-default-command 'consult-dir-dired)
:bind
(:map minibuffer-mode-map
("H-d" . consult-dir)))
```
`consult-git-log-grep` {#consult-git-log-grep}
_[consult-git-log-grep](https://github.com/ghosty141/consult-git-log-grep) provides an interactive way to search the git log using `consult`._
```emacs-lisp
(use-package consult-git-log-grep
:after consult
:defer t)
```
`consult-yasnippet` {#consult-yasnippet}
_[consult-yasnippet](https://github.com/mohkale/consult-yasnippet/tree/cdb256d2c50e4f8473c6052e1009441b65b8f8ab) provides `consult` functionality to `yasnippet`._
```emacs-lisp
(use-package consult-yasnippet
:config
;; we delay previews to avoid accidentally triggering snippets that execute elisp code
(consult-customize consult-yasnippet :preview-key nil)
(add-to-list 'vertico-multiform-commands
'(consult-yasnippet grid))
:bind
("C-H-y" . consult-yasnippet))
```
`embark-consult` {#embark-consult}
_[embark-consult](https://github.com/oantolin/embark/blob/master/embark-consult.el) provides integration between `embark` and `consult`._
```emacs-lisp
(use-package embark-consult
:after embark consult)
```
`marginalia` {#marginalia}
_[marginalia](https://github.com/minad/marginalia) displays annotations (such as docstrings) next to completion candidates._
```emacs-lisp
(use-package marginalia
:init
(marginalia-mode))
```
`orderless` {#orderless}
_[orderless](https://github.com/oantolin/orderless) is a completion style that matches multiple regexps in any order._
```emacs-lisp
(use-package orderless
:custom
(completion-styles '(orderless basic partial-completion))
(completion-category-overrides '((file (styles basic partial-completion))))
(orderless-matching-styles '(orderless-regexp)))
```
`orderless-extras` {#orderless-extras}
_[orderless-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/orderless-extras.el) collects my extensions for `orderless`._
I define the following style dispatchers to extend `orderless` functionality:
| Suffix | Matching Style | Example | Description |
|--------|---------------------------|---------|----------------------------------------|
| ~ | orderless-flex | abc~ | Flex/fuzzy matching |
| , | orderless-initialism | abc, | Match initials (e.g., "abc" → "a-b-c") |
| ; | orderless-prefixes | abc; | Match word prefixes |
| ! | orderless-without-literal | !abc | Exclude matches containing pattern |
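For illustration, a dispatcher of this kind takes only a few lines (a sketch following the pattern documented in the `orderless` README; the actual definitions live in `orderless-extras`, and `my/flex-dispatcher` is a hypothetical name):

```emacs-lisp
;; Sketch of a flex dispatcher: a component ending in "~" is matched
;; with `orderless-flex', with the tilde stripped from the pattern.
(defun my/flex-dispatcher (component _index _total)
  "Use flex matching for COMPONENT when it ends in a tilde."
  (when (string-suffix-p "~" component)
    (cons 'orderless-flex (substring component 0 -1))))
```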
```emacs-lisp
(use-personal-package orderless-extras
:after orderless
:custom
(orderless-style-dispatchers '(orderless-extras-flex-dispatcher
orderless-extras-initialism-dispatcher
orderless-extras-prefixes-dispatcher
orderless-extras-exclusion-dispatcher)))
```
`affe` {#affe}
_[affe](https://github.com/minad/affe) is an Asynchronous Fuzzy Finder for Emacs._
```emacs-lisp
(use-package affe
:custom
;; https://github.com/minad/affe?tab=readme-ov-file#installation-and-configuration
(affe-regexp-compiler #'affe-orderless-regexp-compiler)
(affe-count 100)
:config
(defun affe-orderless-regexp-compiler (input _type _ignorecase)
(setq input (cdr (orderless-compile input)))
(cons input (apply-partially #'orderless--highlight input t)))
:bind
(("H-P" . affe-grep)
("H-K" . affe-find)))
```
`nerd-icons-completion` {#nerd-icons-completion}
_[nerd-icons-completion](https://github.com/rainstormstudio/nerd-icons-completion) displays nerd icons in completion candidates._
```emacs-lisp
(use-package nerd-icons-completion
:after marginalia
:config
(nerd-icons-completion-mode)
:hook
(marginalia-mode-hook . nerd-icons-completion-marginalia-setup))
```
`ido` {#ido}
_[ido](https://github.com/emacs-mirror/emacs/blob/master/lisp/ido.el) is a completion package for Emacs._
```emacs-lisp
(use-feature ido
:after dired
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'ido-file-history))
:bind
(:map dired-mode-map
("i" . ido-find-file)))
```
`which-key` {#which-key}
_[which-key](https://github.com/justbur/emacs-which-key) displays available keybindings in a popup._
```emacs-lisp
(use-feature which-key
:custom
(which-key-idle-delay 0)
:config
(which-key-mode))
```
completion at point {#completion-at-point}
| package | what it does |
|---------|-----------------------------|
| corfu | completion at point UI |
| cape | completion at point backend |
`corfu` {#corfu}
_[corfu](https://github.com/minad/corfu) enhances completion at point with a small completion popup._
```emacs-lisp
(use-package corfu
:ensure (:files (:defaults "extensions/*")
:includes (corfu-info
corfu-echo
corfu-history
corfu-popupinfo
corfu-quick))
:after faces-extras
:custom
(corfu-auto t) ;; Enable auto completion
(corfu-quit-no-match t) ;; Automatically quit if there is no match
(corfu-cycle vertico-cycle)
(corfu-count vertico-count)
(corfu-auto-prefix 3)
(corfu-auto-delay 0.5)
(corfu-popupinfo-delay 0.1)
:config
(faces-extras-set-and-store-face-attributes
'((corfu-default :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)))
(global-corfu-mode)
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'corfu-history))
:hook
(prog-mode-hook . corfu-popupinfo-mode)
(prog-mode-hook . corfu-echo-mode)
(corfu-mode-hook . corfu-history-mode)
:bind
(:map corfu-map
("M-f" . corfu-quick-complete)
("TAB" . nil)
("<tab>" . nil)
("<return>" . corfu-complete)
("RET" . corfu-complete)))
```
`corfu-extras` {#corfu-extras}
_[corfu-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/corfu-extras.el) collects my extensions for `corfu`._
```emacs-lisp
(use-personal-package corfu-extras
:hook
(minibuffer-setup-hook . corfu-extras-enable-always-in-minibuffer)
:bind
(:map corfu-map
("M-m" . corfu-extras-move-to-minibuffer)))
```
`cape` {#cape}
_[cape](https://github.com/minad/cape) provides completion-at-point extensions._
```emacs-lisp
(use-package cape
:after corfu
:custom
(cape-dabbrev-min-length 4)
:config
(defun cape-enable-completions ()
"Enable file and emoji completion in the current buffer."
(setq-local completion-at-point-functions
(cons #'cape-file completion-at-point-functions)
completion-at-point-functions
(cons #'cape-emoji completion-at-point-functions)))
:hook
((text-mode-hook prog-mode-hook) . cape-enable-completions))
```
`corg` {#corg}
_[corg](https://github.com/isamert/corg.el) provides completion-at-point for org-mode source block and dynamic block headers._
```emacs-lisp
(use-package corg
:ensure (:host github
:repo "isamert/corg.el")
:hook
(org-mode-hook . corg-setup))
```
help {#help}
`help` {#help}
_[help](https://github.com/emacs-mirror/emacs/blob/master/lisp/help.el) is the built-in help system._
```emacs-lisp
(use-feature help
:custom
(help-window-select t)
(lossage-size 10000)
:bind
(("C-h C-k" . describe-keymap)
("C-h C-." . display-local-help)
:map help-mode-map
("f" . ace-link-help)
:map input-decode-map
([?\C-m] . [C-m])
([?\C-i] . [C-i])))
```
`help-at-pt` {#help-at-pt}
_[help-at-pt](https://github.com/emacs-mirror/emacs/blob/master/lisp/help-at-pt.el) displays local help based on text properties at point._
```emacs-lisp
(use-feature help-at-pt
:custom
(help-at-pt-display-when-idle 'never)
(help-at-pt-timer-delay 1) ; show help after one second when enabled
:init
(help-at-pt-set-timer)) ; set timer, thus enabling local help
```
`helpful` {#helpful}
_[helpful](https://github.com/Wilfred/helpful) enhances the Emacs help buffer._
```emacs-lisp
(use-package helpful
:config
;; always use `helpful', even when `describe-function' is called by a program
;; (e.g. `transient')
(advice-add 'describe-function :override #'helpful-function)
:hook
(minibuffer-setup-hook . (lambda () (require 'helpful)))
:bind
(("C-h k" . helpful-key)
("C-h f" . helpful-function)
("C-h c" . helpful-command)
("C-h o" . helpful-symbol)
("C-h v" . helpful-variable)
("C-h ." . helpful-at-point)
:map helpful-mode-map
("f" . ace-link-help)
("w" . files-extras-copy-as-kill-dwim)))
```
`info` {#info}
_[info](https://github.com/emacs-mirror/emacs/blob/master/lisp/info.el) is the Info documentation browser._
```emacs-lisp
(use-feature info
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'Info-history-list))
:bind
(:map Info-mode-map
("f" . ace-link-info)
("m" . Info-prev)
("/" . Info-next)
("," . Info-up)
("j" . Info-backward-node)
(";" . Info-forward-node)
("s-j" . Info-menu)))
```
`man` {#man}
_[man](https://github.com/emacs-mirror/emacs/blob/master/lisp/man.el) is a manual page viewer._
```emacs-lisp
(use-feature man
:bind
(:map Man-mode-map
("f" . ace-link-man)))
```
`woman` {#woman}
_[woman](https://github.com/emacs-mirror/emacs/blob/master/lisp/woman.el) browses manual pages without the man command._
```emacs-lisp
(use-feature woman
:bind
(:map woman-mode-map
("f" . ace-link-woman)))
```
`shortdoc` {#shortdoc}
_[shortdoc](https://github.com/emacs-mirror/emacs/blob/e7260d4eb3ed1bebcaa9e2b934f162d4bb42e413/lisp/emacs-lisp/shortdoc.el) provides short function summaries._
```emacs-lisp
(use-feature shortdoc
:bind
("C-h u" . shortdoc-display-group))
```
`find-func` {#find-func}
_[find-func](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/find-func.el) finds the definition of the Emacs Lisp function near point._
```emacs-lisp
(use-feature find-func
:bind
("M-L" . find-library))
```
`elisp-refs` {#elisp-refs}
_[elisp-refs](https://github.com/Wilfred/elisp-refs) finds references to functions, macros and variables in Elisp files._
```emacs-lisp
(use-package elisp-refs
:bind (:map elisp-refs-mode-map
("f" . ace-link-help)))
```
`elisp-demos` {#elisp-demos}
_[elisp-demos](https://github.com/xuchunyang/elisp-demos) displays examples for many Elisp functions._
```emacs-lisp
(use-package elisp-demos
:after helpful
:init
(advice-add 'helpful-update :after 'elisp-demos-advice-helpful-update))
```
keyboard macros {#keyboard-macros}
`kmacro` {#kmacro}
_[kmacro](https://github.com/emacs-mirror/emacs/blob/master/lisp/kmacro.el) provides a simplified interface for keyboard macros._
```emacs-lisp
(use-feature kmacro
:config
(kmacro-set-counter 1)
(with-eval-after-load 'savehist
(dolist (var '(kmacro-ring last-kbd-macro))
(add-to-list 'savehist-additional-variables var)))
:bind
(("A-H-M-s-h" . kmacro-end-or-call-macro) ; = H-h, to circumvent OSX mapping
("H-H" . kmacro-start-macro-or-insert-counter)
("A-C-H-s-h" . kmacro-set-counter)
("M-h" . kmacro-edit-macro)
("M-A-h" . kmacro-bind-to-key)))
```
`kmacro-extras` {#kmacro-extras}
_[kmacro-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/kmacro-extras.el) collects my extensions for `kmacro`._
```emacs-lisp
(use-personal-package kmacro-extras
:bind
("C-A-h" . kmacro-extras-counter-toggle-alpha-number))
```
shell {#shell}
`simple` {#simple}
_[simple](https://github.com/emacs-mirror/emacs/blob/master/lisp/simple.el) configures shell command behaviour for interactive use._
```emacs-lisp
(use-feature simple
:custom
(shell-command-switch "-ic") ; https://stackoverflow.com/a/12229404/4479455
(async-shell-command-buffer 'new-buffer)) ; don't ask for confirmation before running command in a new buffer
```
`shell` {#shell}
_[shell](https://github.com/emacs-mirror/emacs/blob/master/lisp/shell.el) provides a shell-mode interface for inferior shell processes._
```emacs-lisp
(use-feature shell
:init
;; remove maddening "saving session" messages in non-interactive shells
(let ((filtered-env
(seq-filter
(lambda (var)
(let ((var-name (car (split-string var "="))))
(not (member var-name '("TERM_PROGRAM" "TERM_SESSION_ID")))))
process-environment)))
(setq process-environment filtered-env
shell-command-environment filtered-env))
:bind
(("A-s" . shell)
:map shell-mode-map
("M-p" . nil)
("M-n" . nil)
("M-k" . comint-previous-input)
("M-l" . comint-next-input)))
```
`eshell` {#eshell}
_[eshell](https://github.com/emacs-mirror/emacs/blob/master/lisp/eshell/eshell.el) is the Emacs shell, a shell implemented entirely in Emacs Lisp._
```emacs-lisp
(use-feature eshell
:after simple
:custom
(eshell-banner-message "")
(eshell-save-history-on-exit t)
(eshell-hist-ignoredups t)
(eshell-history-size 100000)
(eshell-last-dir-ring-size 1000)
:config
(require 'esh-mode)
:bind
(("A-e" . eshell)
:map eshell-mode-map
("<tab>" . yas-next-field-or-maybe-expand)
("TAB" . yas-next-field-or-maybe-expand) ; why is this necessary for eshell only?
("C-H-M-z" . eshell-kill-input)
("A-C-s-m" . beginning-of-line)
("M-k" . eshell-previous-matching-input-from-input)
("M-l" . eshell-next-matching-input-from-input)
("s-l" . eshell/clear)
("s-d" . eshell-send-eof-to-process)
("M-p" . nil)
("M-n" . nil)))
```
`em-hist` {#em-hist}
_[em-hist](https://github.com/emacs-mirror/emacs/blob/master/lisp/eshell/em-hist.el) provides history management for eshell._
```emacs-lisp
(use-feature em-hist
:defer t
:custom
(eshell-hist-ignoredups t)
(eshell-save-history-on-exit t))
```
`eshell-syntax-highlighting` {#eshell-syntax-highlighting}
_[eshell-syntax-highlighting](https://github.com/akreisher/eshell-syntax-highlighting) provides syntax highlighting for eshell-mode._
```emacs-lisp
(use-package eshell-syntax-highlighting
:after eshell
:hook
(eshell-mode-hook . eshell-syntax-highlighting-global-mode))
```
`dwim-shell-command` {#dwim-shell-command}
_[dwim-shell-command](https://github.com/xenodium/dwim-shell-command) supports Emacs shell commands with dwim behaviour._
```emacs-lisp
(use-package dwim-shell-command
:ensure (:host github
:repo "xenodium/dwim-shell-command")
:defer t)
```
`eat` {#eat}
_[eat](https://codeberg.org/akib/emacs-eat) is a terminal emulator._
```emacs-lisp
(use-package eat
:ensure (:host codeberg
:repo "akib/emacs-eat"
:files ("*.el" ("term" "term/*.el") "*.texi"
"*.ti" ("terminfo/e" "terminfo/e/*")
("terminfo/65" "terminfo/65/*")
("integration" "integration/*")
(:exclude ".dir-locals.el" "*-tests.el")))
:custom
(eat-term-name "xterm-256color")
:hook
(eshell-load-hook . eat-eshell-mode)
(eshell-load-hook . eat-eshell-visual-command-mode))
```
`eat-extras` {#eat-extras}
_[eat-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/eat-extras.el) collects my extensions for `eat`._
```emacs-lisp
(use-personal-package eat-extras
:after eat
:demand t
:hook
(eat-mode-hook . eat-extras-use-fixed-pitch-font)
:config
(eat-extras-setup-semi-char-mode-map))
```
`vterm` {#vterm}
_[vterm](https://github.com/akermu/emacs-libvterm) is another terminal emulator._
```emacs-lisp
(use-package vterm
:defer t
:custom
(vterm-always-compile-module t))
```
`vterm-extras` {#vterm-extras}
_[vterm-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/vterm-extras.el) collects my extensions for `vterm`._
```emacs-lisp
(use-personal-package vterm-extras
:after vterm
:demand t
:config
(vterm-extras-setup-keymap))
```
spelling & grammar {#spelling-and-grammar}
`jinx` {#jinx}
_[jinx](https://github.com/minad/jinx) is a highly performant spell-checker for Emacs._
```emacs-lisp
(use-package jinx
:after faces-extras
:custom
(jinx-languages "en")
:config
(faces-extras-set-and-store-face-attributes
'((jinx-misspelled :underline '(:color "#008000" :style wave))))
(add-to-list 'vertico-multiform-categories
'(jinx grid (vertico-grid-annotate . 20)))
:hook
((text-mode-hook prog-mode-hook conf-mode-hook) . jinx-mode)
:bind
(("M-p" . jinx-correct)))
```
`jinx-extras` {#jinx-extras}
_[jinx-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/jinx-extras.el) collects my extensions for `jinx`._
```emacs-lisp
(use-personal-package jinx-extras
:after jinx
:bind
(("A-M-p" . jinx-extras-toggle-languages)))
```
`flycheck` {#flycheck}
_[flycheck](https://github.com/flycheck/flycheck) is a syntax-checker for Emacs._
```emacs-lisp
(use-package flycheck
:after faces-extras
:custom
;; move temporary flycheck files to a temporary directory
(flycheck-temp-prefix (concat temporary-file-directory "flycheck-"))
(flycheck-emacs-lisp-load-path 'inherit)
(flycheck-indication-mode nil)
(flycheck-display-errors-delay 0.5)
(flycheck-checker-error-threshold 10000)
;; org-lint runs synchronously in-process; it freezes Emacs on large org files
(flycheck-disabled-checkers '(org-lint))
;; https://github.com/skeeto/elfeed/pull/448#issuecomment-1120336279
(flycheck-global-modes '(not . (elfeed-search-mode)))
:config
(faces-extras-set-and-store-face-attributes
'((flycheck-error :underline '(:color "#ff0000" :style wave))
(flycheck-warning :underline '(:color "#0000ff" :style wave))))
:hook
(find-file-hook . global-flycheck-mode)
(org-src-mode-hook . (lambda ()
"Disable `emacs-lisp-checkdoc' in `org-src' blocks."
(setq-local flycheck-disabled-checkers '(emacs-lisp-checkdoc))))
(after-change-major-mode-hook . (lambda ()
"Disable flycheck in selected buffers."
(when (member (buffer-name) '("*scratch*" "notes"))
(flycheck-mode -1))))
:bind
("M-k" . flycheck-next-error))
```
`consult-flycheck` {#consult-flycheck}
_[consult-flycheck](https://github.com/minad/consult-flycheck) integrates `flycheck` with `consult`._
```emacs-lisp
(use-package consult-flycheck
:after consult flycheck)
```
`flycheck-ledger` {#flycheck-ledger}
_[flycheck-ledger](https://github.com/purcell/flycheck-ledger) provides `flycheck` support for `ledger-mode`._
```emacs-lisp
(use-package flycheck-ledger
:after flycheck ledger-mode)
```
`flycheck-languagetool` {#flycheck-languagetool}
_[flycheck-languagetool](https://github.com/emacs-languagetool/flycheck-languagetool) provides `flycheck` support for [LanguageTool](https://languagetool.org/)._
```emacs-lisp
(use-package flycheck-languagetool
:ensure (:host github
:repo "benthamite/flycheck-languagetool"
:branch "fix/guard-stale-buffer-positions") ; https://github.com/emacs-languagetool/flycheck-languagetool/pull/40
:after flycheck
:init
(setopt flycheck-languagetool-server-jar
(expand-file-name (file-name-concat paths-dir-external-repos "LanguageTool/languagetool-server.jar")))
:custom
(flycheck-languagetool-check-params
'(("level" . "picky")
("disabledRules" . "ARROWS,DASH_RULE,DATE_NEW_YEAR,EN_QUOTES,GITHUB,WHITESPACE_RULE")))
:config
(defun flycheck-languagetool-enable ()
"Enable `flycheck-languagetool' in selected buffers."
(unless (or (derived-mode-p 'forge-post-mode
'gfm-mode
'mhtml-mode
'flycheck-error-message-mode
'mu4e-compose-mode
'mu4e-view-mode
'org-journal-mode
'org-msg-edit-mode)
(not (file-directory-p default-directory)))
(flycheck-select-checker 'languagetool)))
(defun flycheck-languagetool-toggle ()
"Toggle the LanguageTool checker in the current buffer."
(interactive)
(if (eq flycheck-checker 'languagetool)
(progn
(setq-local flycheck-checker nil)
(flycheck-buffer)
(message "LanguageTool disabled"))
(flycheck-select-checker 'languagetool)
(message "LanguageTool enabled")))
:hook
((markdown-mode-hook
org-mode-hook
org-msg-edit-mode-hook) . flycheck-languagetool-enable))
```
`flymake-mdl` {#flymake-mdl}
_[flymake-mdl](https://github.com/MicahElliott/flymake-mdl) provides a flymake backend for markdownlint._
```emacs-lisp
(use-package flymake-mdl
:ensure (:host github
:repo "MicahElliott/flymake-mdl")
:after markdown-mode
:demand t)
```
prose {#prose}
`text-mode` {#text-mode}
_[text-mode](https://github.com/emacs-mirror/emacs/blob/master/lisp/textmodes/text-mode.el) is the major mode for editing plain text._
```emacs-lisp
(use-feature text-mode
:hook
(text-mode-hook . simple-extras-visual-line-mode-enhanced)
(text-mode-hook . (lambda ()
"Disable ispell completion in text mode."
(remove-hook 'completion-at-point-functions #'ispell-completion-at-point t))))
```
`atomic-chrome` {#atomic-chrome}
_[atomic chrome](https://github.com/KarimAziev/atomic-chrome) enables editing of browser input fields in Emacs._
I use a fork that is better maintained, together with the associated [Chrome Emacs](https://github.com/KarimAziev/chrome-emacs) browser extension.
```emacs-lisp
(use-package atomic-chrome
:ensure (:repo "KarimAziev/atomic-chrome"
:host github)
:defer 30
:custom
(atomic-chrome-default-major-mode 'markdown-mode)
(atomic-chrome-create-file-strategy 'buffer) ; needed for proper recognition of modes
(atomic-chrome-url-major-mode-alist
'(("github\\.com" . gfm-mode)
("wikipedia\\.org" . mediawiki-mode)
("timelines\\.issarice\\.com" . mediawiki-mode)))
:config
(setq-default atomic-chrome-extension-type-list '(atomic-chrome))
(atomic-chrome-start-server)
:bind
(:map atomic-chrome-edit-mode-map
("s-c" . atomic-chrome-close-current-buffer)))
```
`markdown-mode` {#markdown-mode}
_[markdown-mode](https://github.com/jrblevin/markdown-mode) is a major mode for editing Markdown-formatted text._
```emacs-lisp
(use-package markdown-mode
:custom
(markdown-fontify-code-blocks-natively t)
(markdown-command "pandoc --from markdown --to html")
(markdown-disable-tooltip-prompt t)
(markdown-italic-underscore t)
:config
;; pop code block indirect buffers in the same window, mirroring the org behavior
(add-to-list 'display-buffer-alist
'("\\*edit-indirect.*\\*"
(display-buffer-same-window)))
:bind
(:map gfm-mode-map
("s-a" . markdown-insert-gfm-code-block)
("s-z" . markdown-edit-code-block)
("A-C-H-t" . markdown-mode-extras-copy-section)
("A-C-s-r" . markdown-outline-previous)
("A-C-s-f" . markdown-outline-next)
("M-p" . nil)
("A-s-f" . markdown-footnote-goto-text)
("A-s-r" . markdown-footnote-return)
("s-b" . markdown-insert-bold)
("s-e" . markdown-insert-code)
("s-f" . markdown-insert-footnote)
("s-i" . markdown-insert-italic)
("s-k" . markdown-insert-link)
("s-p" . markdown-preview)
:map markdown-mode-map
("s-a" . markdown-insert-gfm-code-block)
("s-z" . markdown-edit-code-block)
("A-C-H-t" . markdown-mode-extras-copy-section)
("A-C-s-r" . markdown-outline-previous)
("A-C-s-f" . markdown-outline-next)
("M-p" . nil)
("A-s-f" . markdown-footnote-goto-text)
("A-s-r" . markdown-footnote-return)
("s-b" . markdown-insert-bold)
("s-e" . markdown-insert-code)
("s-f" . markdown-insert-footnote)
("s-i" . markdown-insert-italic)
("s-k" . markdown-insert-link)
("s-p" . markdown-preview)))
```
`markdown-mode-extras` {#markdown-mode-extras}
_[markdown-mode-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/markdown-mode-extras.el) collects my extensions for `markdown-mode`._
```emacs-lisp
(use-personal-package markdown-mode-extras
:bind
(:map gfm-mode-map
("A-C-H-t" . markdown-mode-extras-copy-section)
("s-l" . markdown-mode-extras-insert-locator)
("s-r" . markdown-mode-extras-remove-url-in-link)
("s-v" . markdown-mode-extras-paste-with-conversion)
("H-s-v" . markdown-mode-extras-org-paste-dwim)
:map markdown-mode-map
("A-C-H-t" . markdown-mode-extras-copy-section)
("s-l" . markdown-mode-extras-insert-locator)
("s-r" . markdown-mode-extras-remove-url-in-link)
("s-v" . markdown-mode-extras-paste-with-conversion)
("H-s-v" . markdown-mode-extras-org-paste-dwim)
:map org-mode-map
("H-s-v" . markdown-mode-extras-org-paste-dwim)))
```
`grip-mode` {#grip-mode}
_[grip-mode](https://github.com/seagle0128/grip-mode) provides org-mode and GitHub-flavored Markdown preview using grip._
```emacs-lisp
(use-package grip-mode
:defer t
:init
(with-eval-after-load 'markdown-mode
(bind-keys :map markdown-mode-map
("s-w" . grip-mode)))
:custom
(grip-github-user (auth-source-pass-get "user" "tlon/core/api.github.com/grip-mode"))
(grip-github-password (auth-source-pass-get 'secret "tlon/core/api.github.com/grip-mode"))
:config
(require 'xwidget))
```
`xwidget` {#xwidget}
_[xwidget](https://github.com/emacs-mirror/emacs/blob/master/lisp/xwidget.el) provides API functions for xwidgets._
```emacs-lisp
(use-feature xwidget
:config
;; do not prompt user when attempting to kill xwidget buffer
(remove-hook 'kill-buffer-query-functions #'xwidget-kill-buffer-query-function)
:bind
(:map xwidget-webkit-mode-map
("," . xwidget-webkit-scroll-down)
("." . xwidget-webkit-scroll-up)
("j" . xwidget-webkit-scroll-top)
(";" . xwidget-webkit-scroll-bottom)))
```
`edit-indirect` {#edit-indirect}
_[edit-indirect](https://github.com/Fanael/edit-indirect) supports editing regions in separate buffers._
This package is required by the `markdown-mode` command `markdown-edit-code-block`.
```emacs-lisp
(use-package edit-indirect
:after markdown-mode
:bind (:map edit-indirect-mode-map
("s-z" . edit-indirect-commit)))
```
`mediawiki` {#mediawiki}
_[mediawiki](https://github.com/hexmode/mediawiki-el) is an Emacs interface to editing mediawiki sites._
```emacs-lisp
(use-package mediawiki
;; :ensure (:tag "2.4.3") ; otherwise can't authenticate; https://github.com/hexmode/mediawiki-el/issues/48
:custom
(mediawiki-site-alist `(("Wikipedia"
"https://en.wikipedia.org/w/"
"Sir Paul"
,(auth-source-pass-get 'secret "chrome/auth.wikimedia.org/Sir_Paul") ""
:description "English Wikipedia" :first-page "Main Page")))
(mediawiki-draft-data-file (file-name-concat paths-dir-notes "drafts.wiki"))
:bind
(:map mediawiki-mode-map
("s-k" . mediawiki-insert-link)
("A-C-s-r" . mediawiki-prev-header)
("A-C-s-f" . mediawiki-next-header)))
```
`wikipedia` {#wikipedia}
_[wikipedia](https://github.com/benthamite/wikipedia) is an Emacs interface for Wikipedia, with a focus on fast editing workflows, review tools, and optional local/offline browsing of watched pages._
```emacs-lisp
(use-package wikipedia
:ensure (:host github
:repo "benthamite/wikipedia"
:depth nil)
:defer t
:custom
(wikipedia-draft-directory (file-name-concat paths-dir-google-drive "wikipedia"))
(wikipedia-ai-model 'gemini-flash-lite-latest)
(wikipedia-auto-update-mode 1)
(wikipedia-ai-review-auto t)
(wikipedia-ai-summarize-auto t)
(wikipedia-watchlist-score-reason-auto t)
(wikipedia-watchlist-sort-by-score t)
:bind
(("A-p" . wikipedia-transient)))
```
`gdocs` {#gdocs}
_[gdocs](https://github.com/benthamite/gdocs) provides Emacs integration with Google Docs_
```emacs-lisp
(use-package gdocs
:ensure (:host github
:repo "benthamite/gdocs")
:defer t
:custom
(gdocs-accounts
`(("personal" . ((client-id . ,(auth-source-pass-get "gdocs-client-id" "chrome/cloud.google.com"))
(client-secret . ,(auth-source-pass-get "gdocs-client-secret" "chrome/cloud.google.com")))))))
```
`gdrive` {#gdrive}
_[gdrive](https://github.com/benthamite/gdrive) is an interface to Google Drive._
```emacs-lisp
(use-package gdrive
:ensure (:host github
:repo "benthamite/gdrive"
:depth nil)
:defer t
:init
(load-file (file-name-concat paths-dir-dotemacs "etc/gdrive-users.el"))
(defun gdrive-extras-mark-multiple-and-share (&optional file)
"Search for each line in FILE and share the selected results.
The file should contain one search query per line."
(interactive)
(require 'gdrive)
(gdrive-mark-clear)
(if-let ((file (or file (when (y-or-n-p "Read file? ")
(read-file-name "File: "))))
(lines (files-extras-lines-to-list file)))
(dolist (line lines)
(gdrive-extras--mark-matching-results line))
(gdrive-extras--mark-matching-results (read-string "ID: ")))
(gdrive-share-results gdrive-marked-files))
(defun gdrive-extras--mark-matching-results (string)
"Mark the files that match STRING."
(let ((results (gdrive-act-on-selected-search-results string)))
(gdrive-mark-results results))))
```
`ledger-mode` {#ledger-mode}
_[ledger-mode](https://github.com/ledger/ledger-mode) is a major mode for interacting with the Ledger accounting system._
To populate the database of historical prices:
- commodities: <https://github.com/LukasJoswiak/blog-code/blob/master/2020/tracking-commodity-prices-ledger/prices.py>
  - accompanying post: <https://lukasjoswiak.com/tracking-commodity-prices-in-ledger/>
- crypto: <https://github.com/cjtapper/coinprices>
- currencies: <https://github.com/wakatara/get-FX>
  - couldn't make it work, so I just entered the rates manually once and will use those
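When entering rates manually, the price database is just a plain-text file of `P` directives, one per price point: `P DATE SYMBOL PRICE COMMODITY`. A minimal sketch of what the `.pricedb` referenced in the reports below might contain (symbols and rates are illustrative, not real data):

```ledger
P 2024-01-02 AAPL 185.64 USD
P 2024-01-02 BTC 44950.00 USD
P 2024-01-02 USD 810.00 ARS
```

Ledger then resolves `--exchange USD` conversions from the most recent applicable `P` entry in the file passed via `--price-db`.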
```emacs-lisp
(use-package ledger-mode
:custom
(ledger-default-date-format ledger-iso-date-format)
(ledger-reconcile-default-commodity "ARS")
(ledger-mode-extras-currencies '("USD" "EUR" "GBP" "ARS"))
(ledger-schedule-file paths-file-tlon-ledger-schedule-file)
(ledger-schedule-look-forward 0)
(ledger-schedule-look-backward 30)
:config
(dolist (report
'(("net worth"
"%(binary) --date-format '%Y-%m-%d' -f %(ledger-file) bal --strict")
("net worth (USD)"
"%(binary) --date-format '%Y-%m-%d' -f %(ledger-file) --price-db .pricedb --exchange USD bal ^assets ^liabilities --strict")
("account"
"%(binary) --date-format '%Y-%m-%d' -f %(ledger-file) reg %(account) --price-db .pricedb")
("account (USD)"
"%(binary) --date-format '%Y-%m-%d' -f %(ledger-file) reg %(account) --price-db .pricedb --exchange USD --limit \"commodity == 'USD'\"")
("cost basis"
"%(binary) --date-format '%Y-%m-%d' -f %(ledger-file) --basis bal %(account) --strict")
("account (unrounded)"
"%(binary) --date-format '%Y-%m-%d' --unround -f %(ledger-file) reg %(account)")))
(add-to-list 'ledger-reports report))
:hook
(ledger-reconcile-mode-hook . (lambda () (mouse-wheel-mode -1)))
:bind
(:map ledger-mode-map
("A-C-s-f" . ledger-navigate-next-xact-or-directive)
("A-C-s-r" . ledger-navigate-prev-xact-or-directive)
("A-s-e" . ledger-toggle-current-transaction)
("M-n" . nil)
("M-p" . nil)
("s-=" . ledger-reconcile)
("s-a" . ledger-add-transaction)
("s-b" . ledger-post-edit-amount)
("s-d" . ledger-delete-current-transaction)
("s-f" . ledger-occur)
("s-g" . ledger-report-goto)
("s-i" . ledger-insert-effective-date)
("s-k" . ledger-report-quit)
("s-l" . ledger-display-ledger-stats)
("s-o" . ledger-report-edit-report)
("s-p" . ledger-display-balance-at-point)
("s-q" . ledger-post-align-dwim)
("s-r" . ledger-report)
("s-s" . ledger-report-save)
("s-u" . ledger-schedule-upcoming)
("s-v" . ledger-copy-transaction-at-point)
("s-x" . ledger-fully-complete-xact)
("s-y" . ledger-copy-transaction-at-point)
("s-z" . ledger-report-redo)
:map ledger-report-mode-map
("A-C-s-f" . ledger-navigate-next-xact-or-directive)
("A-C-s-r" . ledger-navigate-prev-xact-or-directive)
:map ledger-reconcile-mode-map
("q" . ledger-reconcile-quit)))
```
`ledger-mode-extras` {#ledger-mode-extras}
_[ledger-mode-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/ledger-mode-extras.el) collects my extensions for `ledger-mode`._
```emacs-lisp
(use-personal-package ledger-mode-extras
:after ledger-mode
:demand t
:bind
(:map ledger-mode-map
("s-SPC" . ledger-mode-extras-new-entry-below)
("s-t" . ledger-mode-extras-sort-region-or-buffer)
("s-e" . ledger-mode-extras-sort-region-or-buffer-reversed)
("A-s-a" . ledger-mode-extras-report-account)
("A-s-b" . ledger-mode-extras-decrease-date-by-one-day)
("A-s-c" . ledger-mode-extras-copy-transaction-at-point)
("A-s-f" . ledger-mode-extras-increase-date-by-one-day)
("A-s-t" . ledger-mode-extras-sort-region-or-buffer-reversed)
("A-s-u" . ledger-mode-extras-report-net-worth-USD)
("A-s-w" . ledger-mode-extras-report-net-worth)
("s-c" . ledger-mode-extras-align-and-next)
("s-x" . ledger-mode-extras-kill-transaction-at-point)))
```
translation {#translation}
`tlon` {#tlon}
_[tlon](https://github.com/tlon-team/tlon) is a set of Emacs commands that my team uses in various contexts._
```emacs-lisp
(use-package tlon
:ensure (:host github
:repo "tlon-team/tlon.el"
:depth nil) ; clone entire repo, not just last commit
:after paths
:init
(with-eval-after-load 'forge
(bind-keys :map forge-topic-mode-map
("," . tlon-visit-counterpart-or-capture)
("'" . tlon-open-forge-file)))
(with-eval-after-load 'magit
(bind-keys :map magit-status-mode-map
("," . tlon-visit-counterpart-or-capture)
("'" . tlon-open-forge-file)))
(with-eval-after-load 'markdown-mode
(bind-keys :map markdown-mode-map
("H-;" . tlon-md-menu)
("A-s-e" . tlon-yaml-edit-field)
("s-c" . tlon-copy-dwim)
("s-l" . tlon-md-insert-locator)
("s-u" . tlon-md-insert-entity)
("A-C-s-SPC" . tlon-md-beginning-of-buffer-dwim)
("A-C-s-<tab>" . tlon-md-end-of-buffer-dwim)
:map gfm-mode-map
("H-;" . tlon-md-menu)
("A-s-e" . tlon-yaml-edit-field)
("s-c" . tlon-copy-dwim)
("s-l" . tlon-md-insert-locator)
("s-u" . tlon-md-insert-entity)
("A-C-s-SPC" . tlon-md-beginning-of-buffer-dwim)
("A-C-s-<tab>" . tlon-md-end-of-buffer-dwim)))
:custom
(tlon-forg-archive-todo-on-close t)
(tlon-forg-sort-after-sync-or-capture t)
:config
(run-with-idle-timer (* 60 60) t #'tlon-pull-issues-in-all-repos)
(with-eval-after-load 'tlon-db
(defun tlon-db-extras--suppress-debugger (orig-fun &rest args)
"Call ORIG-FUN with ARGS with `debug-on-error' bound to nil.
When `tlon-db-get-entries-no-confirm' runs from its idle timer,
`url-retrieve-synchronously' calls `accept-process-output', which
can fire sentinels for unrelated connections (e.g. ghub/Forge).
If those sentinels signal an error while `debug-on-error' is t,
the debugger's `recursive-edit' freezes Emacs inside the timer."
(let ((debug-on-error nil))
(condition-case err
(apply orig-fun args)
(error
(message "tlon-db: periodic refresh failed: %s"
(error-message-string err))))))
(advice-add 'tlon-db-get-entries-no-confirm :around
'tlon-db-extras--suppress-debugger))
:hook
(init-post-init-hook . tlon-initialize)
:bind
(("M-j" . tlon-node-find)
("H-r" . tlon-dispatch)
("H-?" . tlon-mdx-insert-cite)
("A-C-p" . tlon-grep)))
```
`johnson` {#johnson}
_[johnson](https://github.com/benthamite/johnson) is a multi-format dictionary UI for Emacs, providing the functionality of programs such as GoldenDict._
```emacs-lisp
(use-package johnson
:ensure (:host github
:repo "benthamite/johnson")
:custom
(johnson-dictionary-directories '("/Users/pablostafforini/My Drive/Dictionaries/"))
:bind
(("A-o" . johnson-lookup)
:map johnson-mode-map
("," . johnson-prev-section)
("." . johnson-next-section)
("f" . johnson-ace-link)))
```
`go-translate` {#go-translate}
_[go-translate](https://github.com/lorniu/go-translate) is an Emacs translator that supports multiple translation engines._
```emacs-lisp
(use-package go-translate
:disabled
:custom
(gts-translate-list '(("en" "es")))
(gts-default-translator
(gts-translator
:picker ; used to pick source text, from, to. choose one.
;;(gts-noprompt-picker)
;;(gts-noprompt-picker :texter (gts-whole-buffer-texter))
(gts-prompt-picker)
;;(gts-prompt-picker :single t)
;;(gts-prompt-picker :texter (gts-current-or-selection-texter) :single t)
:engines ; engines, one or more. Provide a parser to give different output.
(list
;; (gts-bing-engine)
;;(gts-google-engine)
;;(gts-google-rpc-engine)
(gts-deepl-engine :auth-key (auth-source-pass-get "key" (concat "tlon/babel/deepl.com/" tlon-email-shared)) :pro nil)
;; (gts-google-engine :parser (gts-google-summary-parser))
;;(gts-google-engine :parser (gts-google-parser))
;;(gts-google-rpc-engine :parser (gts-google-rpc-summary-parser) :url "https://translate.google.com")
;; (gts-google-rpc-engine :parser (gts-google-rpc-parser) :url "https://translate.google.com")
)
:render ; render, only one, used to consume the output result. Install posframe yourself when using gts-posframe-xxx
;; (gts-buffer-render)
;;(gts-posframe-pop-render)
;;(gts-posframe-pop-render :backcolor "#333333" :forecolor "#ffffff")
;; (gts-posframe-pin-render)
;;(gts-posframe-pin-render :position (cons 1200 20))
;;(gts-posframe-pin-render :width 80 :height 25 :position (cons 1000 20) :forecolor "#ffffff" :backcolor "#111111")
(gts-kill-ring-render)
:splitter ; optional, used to split text into several parts, and the translation result will be a list.
(gts-paragraph-splitter))))
```
`powerthesaurus` {#powerthesaurus}
_[powerthesaurus](https://github.com/SavchenkoValeriy/emacs-powerthesaurus) is an Emacs client for [Power Thesaurus](https://www.powerthesaurus.org/)._
```emacs-lisp
(use-package powerthesaurus
:bind
("H-y" . powerthesaurus-transient))
```
`reverso` {#reverso}
_[reverso](https://github.com/SqrtMinusOne/reverso.el) is an Emacs client for [reverso](https://www.reverso.net/)._
```emacs-lisp
(use-package reverso
:ensure (:host github
:repo "SqrtMinusOne/reverso.el")
:custom
(reverso-languages '(spanish english french german italian portuguese))
:bind
("H-Y" . reverso))
```
`dictionary` {#dictionary}
_dictionary is a client for RFC 2229 dictionary servers._
```emacs-lisp
(use-feature dictionary
:defer t
:custom
(dictionary-server "dict.org")
(dictionary-use-single-buffer t))
```
docs {#docs}
`pdf-tools` {#pdf-tools}
_[pdf-tools](https://github.com/vedang/pdf-tools) is a support library for PDF files._
```emacs-lisp
(use-package pdf-tools
:mode ("\\.pdf\\'" . pdf-view-mode)
:after (:any dired-extras ebib)
:init
(pdf-tools-install t)
(with-eval-after-load 'pdf-annot
(bind-keys :map pdf-annot-minor-mode-map
("e" . pdf-annot-add-highlight-markup-annotation)
("j" . pdf-view-goto-page)
("k" . pdf-view-previous-line-or-previous-page)
("l" . pdf-view-next-line-or-next-page)
("H-c" . pdf-view-kill-ring-save)
("A-u" . pdf-view-midnight-minor-mode)))
(with-eval-after-load 'pdf-history
(bind-keys :map pdf-history-minor-mode-map
("e" . pdf-annot-add-highlight-markup-annotation)
("j" . pdf-view-goto-page)
("k" . pdf-view-previous-line-or-previous-page)
("l" . pdf-view-next-line-or-next-page)
("H-c" . pdf-view-kill-ring-save)
("A-u" . pdf-view-midnight-minor-mode)))
:custom
(pdf-view-use-scaling t)
(pdf-view-use-imagemagick nil)
(pdf-view-resize-factor 1.01)
(pdf-annot-default-annotation-properties
'((t
(label . user-full-name))
(text
(color . "#ff0000")
(icon . "Note"))
(highlight
(color . "LightBlue2"))
(underline
(color . "blue"))
(squiggly
(color . "orange"))
(strike-out
(color . "red"))))
:config
(pdf-cache-prefetch-minor-mode -1) ; https://github.com/vedang/pdf-tools/issues/278#issuecomment-2096894629
:hook
(pdf-view-mode-hook . pdf-view-fit-page-to-window)
:bind
(:map pdf-view-mode-map
("s" . save-buffer)
("e" . pdf-annot-add-highlight-markup-annotation)
("j" . pdf-view-goto-page)
("k" . pdf-view-previous-line-or-previous-page)
("l" . pdf-view-next-line-or-next-page)
("H-c" . pdf-view-kill-ring-save)
("A-u" . pdf-view-midnight-minor-mode)))
```
`pdf-tools-extras` {#pdf-tools-extras}
_[pdf-tools-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/pdf-tools-extras.el) collects my extensions for `pdf-tools`._
```emacs-lisp
(use-personal-package pdf-tools-extras
:init
(with-eval-after-load 'pdf-annot
(bind-keys :map pdf-annot-minor-mode-map
("c" . pdf-tools-extras-copy-dwim)
("x" . pdf-tools-extras-count-words)
("e" . pdf-tools-extras-open-in-ebib)
("h" . pdf-annot-extras-add-highlight-markup-annotation)
("t" . pdf-tools-extras-toggle-writeroom)
("x" . pdf-tools-extras-open-externally)))
(with-eval-after-load 'pdf-history
(bind-keys
:map pdf-history-minor-mode-map
("c" . pdf-tools-extras-copy-dwim)
("x" . pdf-tools-extras-count-words)
("e" . pdf-tools-extras-open-in-ebib)
("h" . pdf-annot-extras-add-highlight-markup-annotation)
("t" . pdf-tools-extras-toggle-writeroom)
("x" . pdf-tools-extras-open-externally)))
:hook
(pdf-tools-enabled-hook . pdf-tools-extras-apply-theme)
(pdf-view-mode-hook . pdf-tools-extras-sel-mode)
:bind (:map pdf-view-mode-map
("c" . pdf-tools-extras-copy-dwim)
("x" . pdf-tools-extras-count-words)
("e" . pdf-tools-extras-open-in-ebib)
("h" . pdf-annot-extras-add-highlight-markup-annotation)
("t" . pdf-tools-extras-toggle-writeroom)
("x" . pdf-tools-extras-open-externally)))
```
`pdf-tools-pages` {#pdf-tools-pages}
_[pdf-tools-pages](https://github.com/benthamite/pdf-tools-pages) is a simple `pdf-tools` extension I created to delete and extract pages from PDF files._
```emacs-lisp
(use-package pdf-tools-pages
:ensure (:host github
:repo "benthamite/pdf-tools-pages")
:after pdf-tools
:init
(with-eval-after-load 'pdf-annot
(bind-keys :map pdf-annot-minor-mode-map
("C" . pdf-tools-pages-clear-page-selection)
("D" . pdf-tools-pages-delete-selected-pages)
("S" . pdf-tools-pages-select-dwim)
("X" . pdf-tools-pages-extract-selected-pages)))
(with-eval-after-load 'pdf-history
(bind-keys
:map pdf-history-minor-mode-map
("C" . pdf-tools-pages-clear-page-selection)
("D" . pdf-tools-pages-delete-selected-pages)
("S" . pdf-tools-pages-select-dwim)
("X" . pdf-tools-pages-extract-selected-pages)))
:bind (:map pdf-view-mode-map
("C" . pdf-tools-pages-clear-page-selection)
("D" . pdf-tools-pages-delete-selected-pages)
("S" . pdf-tools-pages-select-dwim)
("X" . pdf-tools-pages-extract-selected-pages)))
```
`scroll-other-window` {#scroll-other-window}
_[scroll-other-window](https://github.com/benthamite/scroll-other-window) enables scrolling of the other window in `pdf-tools`._
```emacs-lisp
(use-package scroll-other-window
:ensure (:host github
:repo "benthamite/scroll-other-window")
:after pdf-tools
:hook
(pdf-tools-enabled-hook . sow-mode)
:bind
(:map sow-mode-map
("A-C-s-g" . sow-scroll-other-window-down)
("A-C-s-t" . sow-scroll-other-window)))
```
`pdf-view-restore` {#pdf-view-restore}
_[pdf-view-restore](https://github.com/007kevin/pdf-view-restore) saves and restores the last known position in PDF files._
```emacs-lisp
(use-package pdf-view-restore
:after pdf-tools
:init
;; https://github.com/007kevin/pdf-view-restore/issues/6
(defun pdf-view-restore-mode-conditionally ()
"Enable `pdf-view-restore-mode' iff the current buffer is visiting a PDF."
(when (buffer-file-name)
(pdf-view-restore-mode)))
:hook
(pdf-view-mode-hook . pdf-view-restore-mode-conditionally))
```
`moon-reader` {#moon-reader}
_[moon-reader](https://github.com/benthamite/moon-reader) synchronizes page position between pdf-tools and Moon+ Reader._
```emacs-lisp
(use-package moon-reader
:ensure (:host github
:repo "benthamite/moon-reader")
:after pdf-view-restore
:custom
(moon-reader-cache-directory (file-name-concat paths-dir-google-drive "Apps/Books/.Moon+/Cache/")))
```
`org-pdftools` {#org-pdftools}
_[org-pdftools](https://github.com/fuxialexander/org-pdftools) adds org link support for pdf-tools._
```emacs-lisp
(use-package org-pdftools
:ensure (:build (:not elpaca-check-version))
:after org pdf-tools
:hook
(org-mode-hook . org-pdftools-setup-link))
```
`nov` {#nov}
_[nov](https://depp.brause.cc/nov.el/) is a major mode for reading EPUBs in Emacs._
```emacs-lisp
(use-package nov
:defer t)
```
`djvu` {#djvu}
_[djvu](https://elpa.gnu.org/packages/djvu.html) is a major mode for viewing and editing DjVu files in Emacs._
```emacs-lisp
(use-package djvu
:defer t)
```
programming {#programming}
`prog-mode` {#prog-mode}
_[prog-mode](https://github.com/emacs-mirror/emacs/blob/master/lisp/progmodes/prog-mode.el) is the base major mode for programming language modes._
```emacs-lisp
(use-feature prog-mode
:init
(with-eval-after-load 'shell
(bind-keys :map shell-mode-map
("s-c" . exit-recursive-edit)))
:config
(global-prettify-symbols-mode)
:bind
(("A-H-v" . set-variable)
("M-d" . toggle-debug-on-error)
("A-M-d" . toggle-debug-on-quit)
:map prog-mode-map
("A-C-H-i" . mark-defun)
("s-e" . xref-find-definitions)
("s-f" . consult-flycheck)
("M-q" . nil)
("s-q" . prog-fill-reindent-defun)
:map emacs-lisp-mode-map
("s-c" . exit-recursive-edit)))
```
`treesit` {#treesit}
_treesit is Emacs's built-in integration with the Tree-sitter incremental parsing library._
```emacs-lisp
(use-feature treesit
:defer 30
:config
(setq treesit-language-source-alist
'((typescript "https://github.com/tree-sitter/tree-sitter-typescript" "master" "typescript/src")
(tsx "https://github.com/tree-sitter/tree-sitter-typescript" "master" "tsx/src")))
(unless (treesit-language-available-p 'typescript)
(treesit-install-language-grammar 'typescript))
(unless (treesit-language-available-p 'tsx)
(treesit-install-language-grammar 'tsx)))
```
`elisp-mode` {#elisp-mode}
_[elisp-mode](https://github.com/emacs-mirror/emacs/blob/master/lisp/progmodes/elisp-mode.el) is the major mode for editing Emacs Lisp._
```emacs-lisp
(use-feature elisp-mode
:init
(defun instrument-defun ()
"Instrument the current defun."
(interactive)
(eval-defun t))
:bind (:map emacs-lisp-mode-map
("s-b" . eval-buffer)
("s-d" . eval-defun)
("s-i" . instrument-defun)
:map lisp-interaction-mode-map
("s-b" . eval-buffer)
("s-d" . eval-defun)
("s-i" . instrument-defun)))
```
`lisp-mode` {#lisp-mode}
_[lisp-mode](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/lisp-mode.el) is the major mode for editing Lisp code._
```emacs-lisp
(use-feature lisp-mode
:custom
;; default is 65, which overrides the value of `fill-column'
(emacs-lisp-docstring-fill-column nil))
```
`curl-to-elisp` {#curl-to-elisp}
_[curl-to-elisp](https://github.com/xuchunyang/curl-to-elisp) converts cURL commands to Emacs Lisp code._
```emacs-lisp
(use-package curl-to-elisp
:defer t)
```
`f` {#f}
_[f](https://github.com/rejeep/f.el) is a modern API for working with files and directories in Emacs._
```emacs-lisp
(use-package f
:defer t)
```
`s` {#s}
_[s](https://github.com/magnars/s.el) is a string manipulation library._
```emacs-lisp
(use-package s
:defer t)
```
`backtrace` {#backtrace}
_[backtrace](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/backtrace.el) provides generic facilities for displaying backtraces._
```emacs-lisp
(use-feature backtrace
:defer t
:custom
(backtrace-line-length nil))
```
`debug` {#debug}
_[debug](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/debug.el) is the Emacs Lisp debugger._
```emacs-lisp
(use-feature debug
:config
(defun debug-save-backtrace ()
"Save the backtrace at point and copy its path to the kill-ring."
(interactive)
(when (string-match-p "\\*Backtrace\\*" (buffer-name))
(let* ((contents (buffer-string))
(file (file-name-concat paths-dir-downloads "backtrace.el"))
message)
(with-temp-buffer
(insert contents)
(write-region (point-min) (point-max) file)
(setq message (format "Backtrace saved to \"%s\" (%s)"
(abbreviate-file-name file)
(file-size-human-readable (file-attribute-size (file-attributes file)))))
(kill-new file))
(kill-buffer)
(message "%s" message))))
:bind
(:map debugger-mode-map
("s" . debug-save-backtrace)))
```
`edebug` {#edebug}
_[edebug](https://github.com/emacs-mirror/emacs/blob/master/lisp/emacs-lisp/edebug.el) is a source-level debugger for Emacs Lisp._
```emacs-lisp
(use-feature edebug
:custom
(edebug-sit-for-seconds 10)
(edebug-sit-on-break nil)
;; do not truncate print results
(print-level nil)
(print-length nil)
(print-circle nil)
(edebug-print-level nil)
(edebug-print-length nil)
(edebug-print-circle nil) ; disable confusing #N= and #N# print syntax
:bind
(:map emacs-lisp-mode-map
("M-s-d" . edebug-defun)))
```
`macrostep` {#macrostep}
_[macrostep](https://github.com/joddie/macrostep) is an interactive macro-expander._
See [this video](https://www.youtube.com/watch?v=odkYXXYOxpo) (starting at 7:30) for an introduction to this package.
```emacs-lisp
(use-package macrostep
:defer t)
```
`js` {#js}
_[js](https://github.com/emacs-mirror/emacs/blob/master/lisp/progmodes/js.el) is a major mode for editing JavaScript._
```emacs-lisp
(use-feature js
:custom
(js-indent-level 4)
:bind
(:map js-mode-map
("s-w" . eww-extras-browse-file)
("M-," . window-extras-buffer-move-left)
("M-." . window-extras-buffer-move-right)))
```
`js2-mode` {#js2-mode}
_[js2-mode](https://github.com/mooz/js2-mode) is a JavaScript editing mode for Emacs._
```emacs-lisp
(use-package js2-mode
:defer t)
```
`clojure` {#clojure}
_[clojure-mode](https://github.com/clojure-emacs/clojure-mode) provides support for the Clojure(Script) programming language._
```emacs-lisp
(use-package clojure-mode
:defer t)
```
`haskell-mode` {#haskell-mode}
_[haskell-mode](https://github.com/haskell/haskell-mode) is a major mode for Haskell._
```emacs-lisp
(use-package haskell-mode
:defer t)
```
`python` {#python}
_[python](https://github.com/emacs-mirror/emacs/blob/master/lisp/progmodes/python.el) is the major mode for editing Python._
```emacs-lisp
(use-feature python
:custom
(python-shell-interpreter "/Users/pablostafforini/.pyenv/shims/python3")
(org-babel-python-command "/Users/pablostafforini/.pyenv/shims/python3")
(flycheck-python-pycompile-executable "/Users/pablostafforini/.pyenv/shims/python3")
(python-indent-offset 4) ; Set default to suppress warning message
(python-indent-guess-indent-offset nil) ; Don't try to guess indent
:config
(org-babel-do-load-languages
'org-babel-load-languages
'((python . t)))
:bind (:map python-mode-map
("C-c p" . run-python)
("s-l" . python-shell-send-file)
("s-d" . python-shell-send-defun)
("s-c" . python-shell-send-buffer)
("s-s" . python-shell-send-string)
("s-r" . python-shell-send-region)
("s-e" . python-shell-send-statement)
:map python-ts-mode-map
("C-c p" . run-python)
("s-l" . python-shell-send-file)
("s-d" . python-shell-send-defun)
("s-c" . python-shell-send-buffer)
("s-s" . python-shell-send-string)
("s-r" . python-shell-send-region)
("s-e" . python-shell-send-statement)))
```
`pyenv-mode` {#pyenv-mode}
_[pyenv-mode](https://github.com/pythonic-emacs/pyenv-mode) integrates pyenv with python-mode._
```emacs-lisp
(use-package pyenv-mode
:after python
:init
(add-to-list 'exec-path "~/.pyenv/bin"))
```
`pet` {#pet}
_[pet](https://github.com/wyuenho/emacs-pet) tracks down the correct Python tooling executables from Python virtual environments._
```emacs-lisp
(use-package pet
:after python
:defer t
;; Disabled by default because it causes slowdown when opening Python files
;; Uncomment the line below to enable pet-mode automatically
;; :config
;; (add-hook 'python-base-mode-hook 'pet-mode -10)
)
```
`emacs-ipython-notebook` {#emacs-ipython-notebook}
_[emacs-ipython-notebook](https://github.com/millejoh/emacs-ipython-notebook) is a Jupyter notebook client in Emacs._
This needs to be configured.
```emacs-lisp
(use-package ein
:after python
:defer t)
```
`go` {#go}
_[go-mode](https://github.com/dominikh/go-mode.el) provides support for the Go programming language._
```emacs-lisp
(use-package go-mode
:defer t)
```
`applescript-mode` {#applescript-mode}
_[applescript-mode](https://github.com/emacsorphanage/applescript-mode) is a major mode for editing AppleScript._
```emacs-lisp
(use-package applescript-mode
:defer t)
```
`json-mode` {#json-mode}
_[json-mode](https://github.com/json-emacs/json-mode) is a major mode for editing JSON files._
```emacs-lisp
(use-package json-mode
:defer t
:bind
(:map json-mode-map
("RET" . nil)))
```
`csv-mode` {#csv-mode}
_[csv-mode](https://elpa.gnu.org/packages/csv-mode.html) is a major mode for editing comma-separated values._
```emacs-lisp
(use-package csv-mode
:defer t)
```
`yaml` {#yaml}
_[yaml](https://github.com/zkry/yaml.el) is a YAML parser written in Emacs Lisp without any external dependencies._
```emacs-lisp
(use-package yaml
:ensure (:host github
:repo "zkry/yaml.el")
:defer t)
```
`yaml-mode` {#yaml-mode}
_[yaml-mode](https://github.com/yoshiki/yaml-mode) is a major mode for editing YAML files._
```emacs-lisp
(use-package yaml-mode)
```
`shut-up` {#shut-up}
_[shut-up](https://github.com/cask/shut-up/) provides a macro to silence function calls._
```emacs-lisp
(use-package shut-up)
```
`puni` {#puni}
_[puni](https://github.com/AmaiKinono/puni) supports structured editing for many major modes._
```emacs-lisp
(use-package puni
:bind
(:map prog-mode-map
("C-H-M-r" . puni-forward-kill-word)
("C-H-M-q" . puni-backward-kill-word)
("C-H-M-v" . puni-kill-line)
("C-H-M-z" . puni-backward-kill-line)
("A-C-s-d" . puni-forward-sexp)
("A-C-s-e" . puni-backward-sexp)
("A-C-s-r" . puni-beginning-of-sexp)
("A-C-s-f" . puni-end-of-sexp)
("A-C-H-j" . puni-mark-sexp-at-point)
("A-C-H-k" . puni-mark-sexp-around-point)))
```
`hl-todo` {#hl-todo}
_[hl-todo](https://github.com/tarsius/hl-todo) highlights TODO and similar keywords in comments and strings._
```emacs-lisp
(use-package hl-todo
:ensure (:build (:not elpaca-check-version))
:defer 30
:config
(setopt hl-todo-keyword-faces
;; `append' needs the existing value as an argument, or the default
;; keywords (TODO, FIXME, etc.) are silently discarded
(append hl-todo-keyword-faces
'(("WAITING" . "blue")
("LATER" . "violet")
("SOMEDAY" . "brown")
("DELEGATED" . "gray"))))
(global-hl-todo-mode))
```
`consult-todo` {#consult-todo}
_[consult-todo](https://github.com/eki3z/consult-todo) uses `consult` to navigate `hl-todo` keywords._
```emacs-lisp
(use-package consult-todo
:after hl-todo
:bind
(:map prog-mode-map
("s-t" . consult-todo)))
```
`project` {#project}
_[project](https://github.com/emacs-mirror/emacs/blob/master/lisp/progmodes/project.el) provides various functions for dealing with projects._
```emacs-lisp
(use-feature project
:bind
(:map emacs-lisp-mode-map
("s-r" . project-query-replace-regexp)))
```
`hideshow` {#hideshow}
_hideshow is a minor mode for block hiding and showing._
```emacs-lisp
(use-feature hideshow
:hook
(prog-mode-hook . hs-minor-mode))
```
`aggressive-indent` {#aggressive-indent}
_[aggressive-indent](https://github.com/Malabarba/aggressive-indent-mode) keeps code always indented._
```emacs-lisp
(use-package aggressive-indent
:config
(global-aggressive-indent-mode 1)
(add-to-list 'aggressive-indent-excluded-modes 'snippet-mode))
```
`elpy` {#elpy}
_[elpy](https://github.com/jorgenschaefer/elpy) is an Emacs Python development environment._
```emacs-lisp
(use-package elpy
:defer t
:custom
(elpy-rpc-python-command "python3")
(elpy-rpc-virtualenv-path 'current)
;; Disabled by default because it causes slowdown when opening Python files
;; To enable elpy features, run M-x elpy-enable or uncomment the lines below
;; :config
;; (elpy-enable)
)
```
`eldoc` {#eldoc}
_[eldoc](https://elpa.gnu.org/packages/eldoc.html) shows the function arglist or variable docstring in the echo area._
```emacs-lisp
(use-feature eldoc
:config
;; emacs.stackexchange.com/a/55914/32089
(define-advice elisp-get-fnsym-args-string (:around (orig-fun sym &rest r) docstring)
"If SYM is a function, append its docstring."
(concat
(apply orig-fun sym r)
(let* ((doc (and (fboundp sym) (documentation sym 'raw)))
(oneline (and doc (substring doc 0 (string-match "\n" doc)))))
(and oneline
(stringp oneline)
(not (string= "" oneline))
(concat " | " (propertize oneline 'face 'italic))))))
;; reddit.com/r/emacs/comments/1l9y7de/showing_org_mode_link_at_point_in_echo_area/
;; this is natively supported in Emacs 31
(defun org-display-link-info-at-point ()
"Display the link info in the echo area when the cursor is on an Org mode link."
;; the binding must call `is-face-at-point', not bind a symbol to 'org-link
(when-let* (((is-face-at-point 'org-link))
(link-info (get-text-property (point) 'help-echo)))
;; This will show the link in the echo area without it being logged in
;; the Messages buffer.
(let ((message-log-max nil)) (message "%s" link-info))))
(dolist (h '(org-mode-hook org-agenda-mode-hook))
(add-hook h (lambda () (add-hook 'post-command-hook #'org-display-link-info-at-point nil 'local))))
(defun is-face-at-point (face)
"Return non-nil if FACE is applied to the text at point."
(let ((face-at-point (get-text-property (point) 'face)))
(or (eq face-at-point face) (and (listp face-at-point) (memq face face-at-point)))))
(global-eldoc-mode))
```
org-mode {#org-mode}
`org` {#org}
_[org-mode](https://orgmode.org/) is a major mode for keeping notes, authoring documents, computational notebooks, literate programming, maintaining to-do lists, planning projects, and more._
```emacs-lisp
(use-feature org
:custom
(org-directory paths-dir-org) ; set org directory
(org-todo-keywords
'((sequence "TODO(t)"
"DOING(g)"
"IMPORTANT(i)"
"URGENT(u)"
"SOMEDAY(s)"
"MAYBE(m)"
"WAITING(w)"
"PROJECT(p)"
"NEXT(n)"
"LATER(l)"
"|"
"DELEGATED(e)"
"DONE(d)"
"CANCELLED(c)")))
(org-priority-highest 1)
(org-priority-default 7)
(org-priority-lowest 9) ; set priorities
(org-deadline-warning-days 0) ; show due tasks only on the day the tasks are due
(org-hide-emphasis-markers t)
(org-hide-leading-stars t) ; indent every heading and hide all but the last leading star
(org-startup-indented t)
(org-log-into-drawer "STATES")
(org-log-done 'time) ; add timestamp when task is marked as DONE
(org-log-repeat nil) ; do not log TODO status changes for repeating tasks
(org-M-RET-may-split-line nil) ; irreal.org/blog/?p=6297
(org-loop-over-headlines-in-active-region t) ; Allow simultaneous modification of multiple task statuses.
(org-ctrl-k-protect-subtree t)
(org-special-ctrl-a/e t) ; `org-beginning-of-line' goes to beginning of first word
(org-mark-ring-length 4)
(org-pretty-entities nil)
(org-image-actual-width '(800))
(org-link-elisp-confirm-function nil)
(org-file-apps '((auto-mode . emacs)
(directory . emacs)
("\\.mm\\'" . default)
("\\.x?html?\\'" . default)
("\\.pdf\\'" . emacs)))
(org-use-tag-inheritance t)
(org-yank-dnd-method 'attach)
(org-yank-image-save-method paths-dir-org-images)
(org-structure-template-alist
'(("a" . "export ascii")
("c" . "center")
("C" . "comment")
("e" . "example")
("E" . "export")
("h" . "export html")
("l" . "export latex")
("q" . "quote")
("s" . "src")
("se" . "src emacs-lisp")
("sE" . "src emacs-lisp :tangle (init-tangle-conditionally)")
("sc" . "src clojure")
("sj" . "src javascript")
("sm" . "src markdown")
("sp" . "src python")
("sq" . "src sql")
("ss" . "src shell")
("v" . "verse")
("w" . "WP")))
;; refile
(org-reverse-note-order t) ; store notes at the beginning of header
;; export
(org-export-backends '(ascii html icalendar latex md odt texinfo)) ; set export backends
(org-preview-latex-default-process 'dvisvgm)
;; org-src
(org-src-fontify-natively t)
;; org-crypt
(org-tags-exclude-from-inheritance '("crypt"))
:config
(plist-put org-format-latex-options :scale 2)
(plist-put org-format-latex-options :background "Transparent")
(dolist (module '(org-habit org-tempo))
(add-to-list 'org-modules module))
;; force reloading of first file opened so the buffer is correctly formatted
(with-eval-after-load 'org
(when (and (buffer-file-name)
(string-match "\\.org$" (buffer-file-name)))
(revert-buffer nil t)))
:bind
(:map org-mode-map
("C-H-M-s-z" . org-shiftleft)
("C-H-M-s-x" . org-shiftup)
("C-H-M-s-c" . org-shiftdown)
("C-H-M-s-v" . org-shiftright)
("C-H-M-s-a" . org-metaleft)
("C-H-M-s-s" . org-metaup)
("C-H-M-s-d" . org-metadown)
("C-H-M-s-f" . org-metaright)
("C-H-M-s-q" . org-shiftmetaleft)
("C-H-M-s-w" . org-shiftmetaup)
("C-H-M-s-e" . org-shiftmetadown)
("C-H-M-s-r" . org-shiftmetaright)
("s-j" . consult-extras-org-heading)
("s-A-k" . org-web-tools-insert-link-for-url)
("s-l" . org-transclusion-add-all)
("s-c" . ox-clip-formatted-copy)
("s-w" . org-refile)
("s-A-i" . org-id-copy)
("<S-left>" . nil)
("<S-right>" . nil)
("<S-up>" . nil)
("<S-down>" . nil)
("<M-left>" . nil)
("<M-right>" . nil)
("<M-S-left>" . nil)
("<M-S-right>" . nil)
("<M-up>" . nil)
("<M-down>" . nil)
("C-j" . nil)
("<backtab>" . org-shifttab)
("C-k" . nil)
("C-," . nil)
("A-C-s-i" . org-backward-sentence)
("A-C-s-o" . org-forward-sentence)
("A-C-s-," . org-backward-paragraph)
("A-C-s-." . org-forward-paragraph) ; org element?
("A-C-s-m" . org-beginning-of-line)
("A-C-s-z" . org-end-of-line) ; karabiner maps `/' to `z'; otherwise I can't trigger the command while holding `shift'
("A-C-s-r" . org-previous-visible-heading)
("A-C-s-f" . org-next-visible-heading)
("A-C-s-M-m" . org-previous-block)
("A-C-s-M-/" . org-next-block)
("A-C-H-t" . org-extras-copy-dwim)
("A-C-M-s-j" . org-previous-link)
("A-C-M-s-;" . org-next-link)
("A-H-M-t" . org-transpose-element)
("s-d" . org-deadline)
("s-e" . org-set-effort)
("s-f" . org-footnote-action)
("s-h" . org-insert-todo-subheading)
("s-p" . org-time-stamp-inactive)
("s-A-p" . org-time-stamp)
("s-g" . org-agenda)
("A-s-g" . org-gcal-extras-menu)
("s-k" . org-insert-link)
("s-q" . org-set-tags-command)
("s-r" . org-roam-buffer-toggle)
("s-s" . org-schedule)
("s-t" . org-todo)
("s-A-t" . org-sort)
("s-u" . org-clock-split)
("s-y" . org-evaluate-time-range)
("s-z" . org-edit-special)
("s-," . org-priority)
("s-A-e" . org-export-dispatch)
("A-<return>" . "C-u M-<return>")
("A-M-<return>" . org-insert-todo-heading)
;; bindings with matching commands in Fundamental mode
("H-v" . org-yank)
("M-f" . ace-link-org)))
```
`org-extras` {#org-extras}
_[org-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-extras.el) collects my extensions for `org`._
```emacs-lisp
(use-personal-package org-extras
:init
(setq org-extras-agenda-switch-to-agenda-current-day-timer
(run-with-idle-timer (* 10 60) nil #'org-extras-agenda-switch-to-agenda-current-day))
:custom
(org-extras-id-auto-add-excluded-directories (list paths-dir-anki
paths-dir-dropbox-tlon-leo
paths-dir-dropbox-tlon-fede
paths-dir-android
(file-name-concat paths-dir-dropbox-tlon-leo "gptel/")
(file-name-concat paths-dir-dropbox-tlon-fede "archive/")
(file-name-concat paths-dir-dropbox-tlon-core "legal/contracts/")))
(org-extras-agenda-files-excluded (list paths-file-tlon-tareas-leo
paths-file-tlon-tareas-fede))
(org-extras-clock-in-add-participants-exclude
"Leo<>Pablo\\|Fede<>Pablo\\|Tlön: group meeting")
:config
(setopt org-extras-agenda-files-excluded
(append org-extras-agenda-files-excluded
;; files in `paths-dir-inactive' sans ., .., hidden files and subdirectories
(seq-filter (lambda (f) (not (file-directory-p f)))
(directory-files paths-dir-inactive t "^[^.][^/]*$"))))
;; Clock report parameters. This plist was previously evaluated via `quote'
;; without being assigned to anything (a no-op); it matches the format of
;; `org-agenda-clockreport-parameter-plist', so assign it there (assumed target).
(setopt org-agenda-clockreport-parameter-plist
        '(:link t :maxlevel 5 :fileskip0 t :narrow 70 :formula % :indent t :formatter org-extras-clocktable-sorter))
:hook
(org-mode-hook . org-extras-enable-nested-src-block-fontification)
(before-save-hook . org-extras-id-auto-add-ids-to-headings-in-file)
:bind
(("H-;" . org-extras-personal-menu)
("A-H-w" . org-extras-refile-goto-latest)
:map org-mode-map
("s-<return>" . org-extras-super-return)
("s-v" . org-extras-paste-with-conversion)
("A-C-s-n" . org-extras-jump-to-first-heading)
("A-s-b" . org-extras-set-todo-properties)
("A-s-h" . org-extras-insert-todo-subheading-after-body)
("A-s-v" . org-extras-paste-image)
("A-s-z" . org-extras-export-to-ea-wiki)
("M-w" . org-extras-count-words)
("A-s-n" . org-extras-new-clock-entry-today)
("s-." . org-extras-time-stamp-active-current-time)
("A-s-." . org-extras-time-stamp-active-current-date)
("s-/" . org-extras-time-stamp-inactive-current-time)
("A-s-/" . org-extras-time-stamp-inactive-current-date)
("A-s-u" . org-extras-id-update-id-locations)
("A-s-c" . org-extras-mark-checkbox-complete-and-move-to-next-item)
("A-s-o" . org-extras-reset-checkbox-state-subtree)
("H-s-w" . org-extras-refile-and-archive)
("s-A-l" . org-extras-url-dwim)))
```
`org-agenda` {#org-agenda}
_[org-agenda](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-agenda.el) provides agenda views for org tasks and appointments._
```emacs-lisp
(use-feature org-agenda
:after org
:init
(setopt org-agenda-hide-tags-regexp "project")
:custom
(org-agenda-window-setup 'current-window)
(org-agenda-use-time-grid nil)
(org-agenda-ignore-properties '(effort appt category))
(org-agenda-dim-blocked-tasks nil)
(org-agenda-sticky t)
(org-agenda-todo-ignore-with-date t) ; exclude tasks with a date.
(org-agenda-todo-ignore-scheduled 'future) ; exclude scheduled tasks.
(org-agenda-restore-windows-after-quit t) ; don't destroy window splits
(org-agenda-span 1) ; show daily view by default
(org-agenda-clock-consistency-checks ; highlight gaps of five or more minutes in agenda log mode
'(:max-duration "5:00" :min-duration "0:01" :max-gap 5 :gap-ok-around ("2:00")))
(org-agenda-skip-scheduled-if-done t)
(org-agenda-skip-deadline-if-done t)
(org-agenda-log-mode-items '(clock))
(org-agenda-custom-commands
'(("E" "TODOs without effort"
((org-ql-block '(and (todo)
(not (property "effort")))
((org-ql-block-header "TODOs without effort")))))
("w" "Weekly review"
agenda ""
((org-agenda-clockreport-mode t)
(org-agenda-archives-mode t)
(org-agenda-start-day "-7d")
(org-agenda-span 7)
(org-agenda-start-on-weekday 0)))
("p" "Appointments" agenda* "Today's appointments"
((org-agenda-span 1)
(org-agenda-max-entries 3)))
("r"
"Reading list"
tags
"PRIORITY=\"1\"|PRIORITY=\"2\"|PRIORITY=\"3\"|PRIORITY=\"4\"|PRIORITY=\"5\"|PRIORITY=\"6\"|PRIORITY=\"7\"|PRIORITY=\"8\"|PRIORITY=\"9\""
((org-agenda-files (list paths-dir-bibliographic-notes))))
("g" "All TODOs"
todo "TODO")
("G" "All Tlön TODOs"
todo "TODO"
((org-agenda-files (list paths-dir-tlon-todos))))
("," "All tasks with no priority"
tags-todo "-PRIORITY=\"1\"-PRIORITY=\"2\"-PRIORITY=\"3\"-PRIORITY=\"4\"-PRIORITY=\"5\"-PRIORITY=\"6\"-PRIORITY=\"7\"-PRIORITY=\"8\"-PRIORITY=\"9\"")))
(org-agenda-files (list paths-file-calendar))
(org-agenda-archives-mode 'trees)
:config
(advice-add 'org-agenda-goto :after
(lambda (&rest _)
"Narrow to the entry and its children after jumping to it."
(org-extras-narrow-to-entry-and-children)))
(advice-add 'org-agenda-prepare-buffers :around
(lambda (fn files)
"Skip agenda files that can't be opened (e.g. Dropbox online-only placeholders).
`org-agenda-prepare-buffers' passes each file to `org-get-agenda-file-buffer'
and hands the result to `with-current-buffer'. When a file cannot be read
\(e.g. a Dropbox online-only placeholder), the call errors out. Pre-filter
the file list so that only openable files reach the original function."
(let (readable)
(dolist (f files)
(if (bufferp f)
(push f readable)
(condition-case err
(when (org-get-agenda-file-buffer f)
(push f readable))
(error
(message "org-agenda: skipping unreadable file %s: %s"
f (error-message-string err))))))
(funcall fn (nreverse readable)))))
(advice-add 'org-habit-toggle-display-in-agenda :around
(lambda (orig-fun &rest args)
"Prevent `org-modern-mode' interference with org habits."
(if org-habit-show-habits
(progn
(global-org-modern-mode)
(apply orig-fun args)
(org-agenda-redo)
(global-org-modern-mode))
(global-org-modern-mode -1)
(apply orig-fun args)
(org-agenda-redo)
(global-org-modern-mode -1))))
:hook
(org-agenda-mode-hook . (lambda ()
"Disable `visual-line-mode' and `toggle-truncate-lines' in `org-agenda'."
(visual-line-mode -1)
(toggle-truncate-lines)))
:bind
(("s-g" . org-agenda)
:map org-agenda-mode-map
("A-s-g" . org-gcal-extras-menu)
("s-s" . org-save-all-org-buffers)
("I" . org-pomodoro) ; NOTE: overridden by the later "I" binding (org-agenda-diary-entry) below
("h" . org-habit-toggle-display-in-agenda)
("M-k" . org-clock-convenience-timestamp-up)
("M-l" . org-clock-convenience-timestamp-down)
("s-b" . calendar-extras-calfw-block-agenda)
("f" . ace-link-org-agenda)
("?" . org-agenda-filter)
(";" . org-agenda-later)
("C-b" . org-agenda-tree-to-indirect-buffer)
("C-k" . nil)
("d" . org-agenda-deadline)
("M-t" . nil)
("H-n" . nil)
("s-k" . nil)
("s-f" . ace-link-extras-org-agenda-clock-in)
("i" . org-agenda-clock-in)
("I" . org-agenda-diary-entry)
("j" . org-agenda-earlier)
("J" . org-agenda-goto-date)
("k" . org-agenda-previous-line)
("l" . org-agenda-next-line)
("n" . org-agenda-date-later)
("o" . org-agenda-open-link)
("p" . org-agenda-date-earlier)
("q" . org-agenda-kill-all-agenda-buffers)
("RET" . org-extras-agenda-switch-to-dwim)
("/" . org-extras-agenda-done-and-next)
("\"" . org-extras-agenda-postpone-and-next)
("b" . org-extras-agenda-toggle-anniversaries)
("SPC" . org-extras-agenda-goto-and-start-clock)
("x" . org-extras-agenda-toggle-log-mode)
("s" . org-agenda-schedule)
("w" . org-agenda-refile)
("W" . org-agenda-week-view)
("X" . org-agenda-exit)
("y" . org-agenda-day-view)
("z" . org-agenda-undo)))
```
`org-capture` {#org-capture}
_[org-capture](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-capture.el) provides rapid note-taking and task capture._
```emacs-lisp
(use-feature org-capture
:custom
(org-default-notes-file paths-file-inbox-desktop)
(org-capture-templates
`(("." "Todo" entry
(id "B3C12507-6A83-42F6-9FFA-9A45F5C8F278")
"** TODO %?\n" :empty-lines 1)
;; djcbsoftware.nl/code/mu/mu4e/Org_002dmode-links.html
("e" "Email" entry
(id "B3C12507-6A83-42F6-9FFA-9A45F5C8F278")
"** TODO Follow up with %:fromname on %a\nSCHEDULED: %t\n\n%i" :immediate-finish t :empty-lines 1 :prepend t)
("n" "Telegram" entry
(id "B3C12507-6A83-42F6-9FFA-9A45F5C8F278")
"** TODO Follow up with %a\nSCHEDULED: %t\n\n%i" :immediate-finish t :empty-lines 1 :prepend t)
("r" "Calendar" entry
(file ,paths-file-calendar)
"* TODO [#5] %^ \nDEADLINE: %^T" :empty-lines 1 :immediate-finish t)
("s" "Slack" entry
(id "B3C12507-6A83-42F6-9FFA-9A45F5C8F278")
"** TODO Follow up %a\nSCHEDULED: %t\n\n%i" :immediate-finish t :empty-lines 1 :prepend t)
("E" "Epoch inbox" entry
(file paths-file-epoch-inbox)
"** TODO %?\n" :empty-lines 1 :prepend t)
("S" "Slack to Epoch inbox" entry
(file paths-file-epoch-inbox)
"** TODO Follow up %a\nSCHEDULED: %t\n\n%i" :immediate-finish t :empty-lines 1 :prepend t)
("t" "Tlön inbox " entry
(id "E9C77367-DED8-4D59-B08C-E6E1CCDDEC3A")
"** TODO %? \n" :empty-lines 1 :prepend t)
("y" "YouTube playlist" entry
(id "319B1611-A5A6-42C8-923F-884A354333F9")
"* %(org-web-tools-extras-youtube-dl (current-kill 0))\n[[%c][YouTube link]]" :empty-lines 1 :prepend t :immediate-finish t)
;; github.com/alphapapa/org-protocol-capture-html#org-capture-template
("f" "Feed" entry
(id "D70DFEBE-A5FD-4BA6-A054-49E7C8F6448A")
"*** %(org-capture-feed-heading)\n%?" :empty-lines 1)
("w" "Web site" entry
(file paths-file-downloads)
"* %a :website:\n\n%U %?\n\n%:initial")))
:config
(defun org-capture-feed-heading ()
"Prompt for feed URL, name, and tags, returning a formatted Org heading."
(let* ((url (read-string "Feed URL: "))
(name (read-string "Name: "))
(id "D70DFEBE-A5FD-4BA6-A054-49E7C8F6448A")
(file (org-id-find-id-file id))
(all-tags (when file
(with-current-buffer (find-file-noselect file)
(mapcar #'car (org-get-buffer-tags)))))
(selected (completing-read-multiple "Tags: " all-tags))
(tag-str (if selected
(concat " :" (string-join selected ":") ":")
"")))
(format "[[%s][%s]]%s" url name tag-str)))
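;; Illustration (hypothetical values): with URL "https://example.com/feed",
;; name "Example", and selected tags ("news"), `org-capture-feed-heading'
;; returns the string "[[https://example.com/feed][Example]] :news:",
;; i.e. an org link followed by the chosen tags.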
(defun org-capture-feed-sort ()
"Sort feed entries under the Feeds heading alphabetically after capture."
(when (string= (plist-get org-capture-plist :key) "f")
(let ((marker (org-id-find "D70DFEBE-A5FD-4BA6-A054-49E7C8F6448A" 'marker)))
(when marker
(with-current-buffer (marker-buffer marker)
(save-excursion
(goto-char marker)
(org-sort-entries nil ?a))
(save-buffer))
(set-marker marker nil)))))
:hook
(org-capture-before-finalize-hook . org-extras-capture-before-finalize-hook-function)
(org-capture-after-finalize-hook . org-capture-feed-sort)
:bind
(("H-t" . org-capture)
("A-H-t" . org-capture-goto-last-stored)
:map org-capture-mode-map
("s-c" . org-capture-finalize)
("s-w" . org-capture-refile)))
```
`org-clock` {#org-clock}
_[org-clock](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-clock.el) implements clocking time spent on tasks._
```emacs-lisp
(use-feature org-clock
:after org
:custom
(org-clock-out-when-done t)
(org-clock-persist t)
(org-clock-persist-query-resume nil)
(org-clock-in-resume t)
(org-clock-report-include-clocking-task t)
(org-clock-ask-before-exiting nil)
(org-clock-history-length 30)
(org-clock-into-drawer "LOGBOOK") ; file clock entries in the LOGBOOK drawer
:config
(org-clock-persistence-insinuate)
:bind
(("A-H-j" . org-clock-goto)
("A-H-x" . org-clock-cancel)
("H-i" . org-clock-in)
("H-o" . org-clock-out)))
```
`org-clock-convenience` {#org-clock-convenience}
_[org-clock-convenience](https://github.com/dfeich/org-clock-convenience) provides convenience functions to work with org-mode clocking._
```emacs-lisp
(use-package org-clock-convenience
:after org-clock org-agenda
:defer t)
```
`org-clock-split` {#org-clock-split}
_[org-clock-split](https://github.com/0robustus1/org-clock-split) allows splitting of one clock entry into two contiguous entries._
I’m using a fork that fixes some functionality that broke when org changed `org-time-stamp-formats`.
```emacs-lisp
(use-package org-clock-split
:ensure (:host github
:repo "0robustus1/org-clock-split"
:branch "support-emacs-29.1")
:defer t)
```
`org-cycle` {#org-cycle}
_[org-cycle](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-cycle.el) controls visibility cycling of org outlines._
```emacs-lisp
(use-feature org-cycle
:after org
:custom
(org-cycle-emulate-tab nil)) ; TAB always cycles, even if point not on a heading
```
`org-archive` {#org-archive}
_[org-archive](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-archive.el) archives org subtrees._
```emacs-lisp
(use-feature org-archive
:after org
:custom
(org-archive-default-command 'org-archive-to-archive-sibling)
(org-archive-location (expand-file-name "%s_archive.org::" paths-dir-archive))
:bind
(:map org-mode-map
("s-a" . org-archive-subtree-default)))
```
`org-archive-hierarchically` {#org-archive-hierarchically}
_[org-archive-hierarchically](https://gitlab.com/andersjohansson/org-archive-hierarchically) archives org subtrees in a way that preserves the original heading structure._
I normally archive subtrees with `org-archive-to-archive-sibling`, but use `org-archive-hierarchically` for files in public repositories (such as this one). `org-archive-to-archive-sibling` moves archived tasks under a heading that org collapses by default, but on GitHub archived tasks are always fully visible, creating a lot of clutter.
```emacs-lisp
(use-package org-archive-hierarchically
:ensure (:host gitlab
:repo "andersjohansson/org-archive-hierarchically")
:defer t)
```
`org-fold` {#org-fold}
_[org-fold](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-fold.el) manages visibility and folding of org outlines._
```emacs-lisp
(use-feature org-fold
:after org
:custom
(org-fold-catch-invisible-edits 'smart))
```
`org-faces` {#org-faces}
_[org-faces](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-faces.el) defines faces for `org-mode`._
```emacs-lisp
(use-feature org-faces
:after org faces-extras
:custom
(org-fontify-quote-and-verse-blocks t)
:config
(faces-extras-set-and-store-face-attributes
'((org-drawer :family faces-extras-fixed-pitch-font :height faces-extras-org-property-value-height
:foreground "LightSkyBlue")
(org-property-value :family faces-extras-fixed-pitch-font :height faces-extras-org-property-value-height)
(org-special-keyword :family faces-extras-fixed-pitch-font :height faces-extras-org-property-value-height)
(org-meta-line :family faces-extras-fixed-pitch-font :height faces-extras-org-property-value-height)
(org-tag :family faces-extras-fixed-pitch-font :height faces-extras-org-tag-height)
(org-document-title :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-height)
(org-code :family faces-extras-fixed-pitch-font :height faces-extras-org-code-height)
(org-todo :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-archived :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-1 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-2 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-3 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-4 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-5 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-6 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-7 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-level-8 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(org-date :family faces-extras-fixed-pitch-font :height faces-extras-org-date-height)
(org-block :family faces-extras-fixed-pitch-font :height faces-extras-org-block-height)
(org-block-begin-line :family faces-extras-fixed-pitch-font :height faces-extras-org-block-height)
(org-quote :family faces-extras-variable-pitch-font :height 1.0))))
```
`org-id` {#org-id}
_[org-id](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-id.el) manages globally unique IDs for org entries._
```emacs-lisp
(use-feature org-id
:after org
:defer t
:custom
(org-id-link-to-org-use-id t)
;; I want these files to be searched for IDs, so that I can use
;; org-capture templates with them, but do not want them to be part
;; of org-agenda or org-roam.
(org-id-extra-files (list
paths-file-tlon-tareas-leo
paths-file-tlon-tareas-fede))
:config
(defun org-id-ensure-locations-hash-table (&rest _)
"Ensure `org-id-locations' is a hash table.
If `org-id-update-id-locations' is interrupted between building the
alist and converting it to a hash table, `org-id-locations' can be
left as an alist, causing `puthash' errors in `org-id-add-location'."
(when (and org-id-locations (not (hash-table-p org-id-locations)))
(setq org-id-locations (org-id-alist-to-hash org-id-locations))))
(advice-add 'org-id-add-location :before #'org-id-ensure-locations-hash-table))
```
`org-list` {#org-list}
_[org-list](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-list.el) handles plain lists in org-mode._
```emacs-lisp
(use-feature org-list
:after org
:custom
(org-plain-list-ordered-item-terminator ?.)
(org-list-indent-offset 2))
```
`org-refile` {#org-refile}
_org-refile refiles subtrees to various locations._
```emacs-lisp
(use-feature org-refile
:after org
:defer t
:custom
(org-refile-targets '((org-agenda-files :maxlevel . 9)
(files-extras-open-buffer-files :maxlevel . 9)
(nil :maxlevel . 9)))
(org-refile-use-outline-path 'level3)
(org-outline-path-complete-in-steps nil)
(org-refile-allow-creating-parent-nodes nil)
(org-refile-use-cache t) ; build cache at startup
:config
;; Regenerate cache every half hour
(run-with-idle-timer (* 60 30) t #'org-extras-refile-regenerate-cache))
```
`org-keys` {#org-keys}
_[org-keys](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-keys.el) manages speed keys and key bindings for org-mode._
Enable speed keys. To trigger a speed key, point must be at the very beginning of an org headline. Type '?' for a list of keys.
```emacs-lisp
(use-feature org-keys
:after org
:custom
(org-return-follows-link t)
(org-use-speed-commands t)
(org-speed-commands
'(("Outline navigation")
("k" . (org-speed-move-safe 'org-previous-visible-heading))
("." . (org-speed-move-safe 'org-forward-heading-same-level))
("," . (org-speed-move-safe 'org-backward-heading-same-level))
("l" . (org-speed-move-safe 'org-next-visible-heading))
("m" . (org-speed-move-safe 'outline-up-heading))
("j" . (consult-extras-org-heading))
("Outline structure editing")
("a" . (org-metaleft))
("d" . (org-metadown))
("s" . (org-metaup))
("f" . (org-metaright))
("q" . (org-shiftmetaleft))
("e" . (org-shiftmetadown))
("w" . (org-shiftmetaup))
("r" . (org-shiftmetaright))
("Archiving")
("A" . (org-archive-subtree-default))
("'" . (org-force-cycle-archived))
("Meta data editing")
("t" . (org-todo))
("Clock")
("h" . (org-extras-jump-to-latest-clock-entry))
("H" . (lambda () (org-extras-jump-to-latest-clock-entry) (org-extras-clone-clock-entry)))
("i" . (org-clock-in))
("o" . (org-clock-out))
("Regular editing")
("z" . (undo-only))
("X" . (org-cut-subtree)) ; capital 'X' to prevent accidents
("c" . (org-copy-subtree))
("v" . (org-yank))
("Other")
("I" . (org-id-copy))
("p" . (org-priority))
("u" . (org-speed-command-help))
("g" . (org-agenda)))))
```
`ol` {#ol}
_[ol](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/ol.el) implements the org link framework._
```emacs-lisp
(use-feature ol
:after org
:custom
(org-link-search-must-match-exact-headline nil)
(org-ellipsis " ")
:bind
("H-L" . org-store-link))
```
`ol-bbdb` {#ol-bbdb}
_[ol-bbdb](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/ol-bbdb.el) provides support for links to BBDB records in org-mode._
```emacs-lisp
(use-feature ol-bbdb
:after org bbdb ol
:custom
(org-bbdb-anniversary-field 'birthday))
```
`org-protocol` {#org-protocol}
_[org-protocol](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-protocol.el) intercepts calls from emacsclient to trigger custom actions._
[This section of the org-roam manual](https://www.orgroam.com/manual.html#Mac-OS) describes how to set up `org-protocol` on macOS. Note that [emacs-mac](https://bitbucket.org/mituharu/emacs-mac/) supports `org-protocol` out of the box and doesn't require turning on the Emacs server.
```emacs-lisp
(use-feature org-protocol
:after org)
```
`ox` {#ox}
_[ox](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/ox.el) is the generic export engine for org-mode._
```emacs-lisp
(use-feature ox
:after org
:defer t
:custom
(org-export-exclude-tags '("noexport" "ARCHIVE"))
(org-export-with-broken-links 'mark) ; allow export with broken links
(org-export-with-section-numbers nil) ; do not add numbers to section headings
(org-export-with-toc nil) ; do not include table of contents
(org-export-with-title nil) ; do not include title
(org-export-headline-levels 4) ; include up to level 4 headlines
(org-export-with-tasks nil) ; exclude headings with TODO keywords
(org-export-with-todo-keywords nil) ; do not render TODO keywords in headings
(org-export-preserve-breaks t) ; respect single breaks when exporting
;; (org-export-with-author nil "do not include author")
;; (org-export-with-date nil "do not include export date")
;; (org-html-validation-link nil "do not include validation link")
(org-export-show-temporary-export-buffer nil)) ; bury temporary export buffers generated by `org-msg'
```
`ox-html` {#ox-html}
_[ox-html](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/ox-html.el) is the HTML back-end for the org export engine._
```emacs-lisp
(use-feature ox-html
:after org ox
:custom
(org-html-postamble nil)) ; disable the postamble; makes the commented-out author/date/validation-link settings in the `ox` block unnecessary
```
`ox-latex` {#ox-latex}
_[ox-latex](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/ox-latex.el) is the LaTeX back-end for the org export engine._
```emacs-lisp
(use-feature ox-latex
:after org ox
:custom
;; get rid of temporary LaTeX files upon export
(org-latex-logfiles-extensions (quote
("lof" "lot" "tex" "aux" "idx" "log" "out" "toc" "nav" "snm" "vrb" "dvi" "fdb_latexmk" "blg" "brf" "fls" "entoc" "ps" "spl" "bbl" "pygtex" "pygstyle"))))
```
`ox-hugo` {#ox-hugo}
_[ox-hugo](https://github.com/kaushalmodi/ox-hugo) is an org-mode exporter back-end for Hugo._
Hugo should be able to export `org-cite` citations.
```emacs-lisp
(use-package ox-hugo
:after org ox
:demand t
:custom
(org-hugo-default-section-directory "posts"))
```
`ox-hugo-extras` {#ox-hugo-extras}
_[ox-hugo-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/ox-hugo-extras.el) collects my extensions for `ox-hugo`._
```emacs-lisp
(use-personal-package ox-hugo-extras
:after ox-hugo
:demand t)
```
`stafforini` {#stafforini}
_[stafforini](https://github.com/benthamite/stafforini.el) is a private package that provides Emacs commands for building and previewing my personal site, built with Hugo._
```emacs-lisp
(use-package stafforini
:ensure (:host github :repo "benthamite/stafforini.el")
:defer t
:bind (("A-H-s" . stafforini-menu)))
```
`ox-pandoc` {#ox-pandoc}
_[ox-pandoc](https://github.com/kawabata/ox-pandoc) is an org-mode exporter that uses Pandoc._
```emacs-lisp
(use-package ox-pandoc
:after org ox
:defer t)
```
`ox-gfm` {#ox-gfm}
_[ox-gfm](https://github.com/larstvei/ox-gfm) is a GitHub Flavored Markdown exporter for org-mode._
```emacs-lisp
(use-package ox-gfm
:after org ox
:defer t)
```
`ob` {#ob}
_ob provides support for code blocks in org-mode._
Note to self: Typescript syntax highlighting works fine, but calling `org-edit-special` triggers an error; see the gptel conversation named `typescript-syntax-highlighting`.
```emacs-lisp
(use-feature ob
:after org
:defer t
:custom
(org-confirm-babel-evaluate 'org-extras-confirm-babel-evaluate)
(org-export-use-babel t)
:config
;; use dynamic (non-lexical) binding in emacs-lisp code blocks
(setcdr (assq :lexical org-babel-default-header-args:emacs-lisp) "no")
;; Prevent code-block evaluation during export while preserving noweb expansion
(setf (alist-get :eval org-babel-default-header-args) "never-export")
(setq org-babel-default-header-args:python
'((:exports . "both")
(:results . "replace output")
(:session . "none")
(:cache . "no")
(:noweb . "no")
(:hlines . "no")
(:tangle . "no")))
(org-babel-do-load-languages
'org-babel-load-languages
'((emacs-lisp . t)
(shell . t)
(python . t)
(R . t)))
(dolist (cons (list (cons "j" 'org-babel-next-src-block)
(cons "k" 'org-babel-previous-src-block)
(cons "n" 'org-babel-insert-header-arg)
(cons "p" 'org-babel-remove-result-one-or-many)))
(add-to-list 'org-babel-key-bindings cons)))
```
`ob-typescript` {#ob-typescript}
_[ob-typescript](https://github.com/lurdan/ob-typescript) enables the execution of typescript code blocks._
```emacs-lisp
(use-package ob-typescript
:after ob
:config
(setq org-babel-default-header-args:typescript
'((:mode . typescript-ts-mode)))
(add-to-list 'org-babel-load-languages '(typescript . t))
(add-to-list 'major-mode-remap-alist '(typescript-mode . typescript-ts-mode))
(defalias 'typescript-mode 'typescript-ts-mode))
```
`org-tempo` {#org-tempo}
_org-tempo provides completion templates for org-mode._
```emacs-lisp
(use-feature org-tempo
:after org)
```
`org-src` {#org-src}
_[org-src](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-src.el) manages source code blocks in org-mode._
```emacs-lisp
(use-feature org-src
:after org
:custom
(org-edit-src-content-indentation 0)
(org-src-preserve-indentation nil)
(org-src-window-setup 'current-window)
(org-src-tab-acts-natively nil) ; When set to `nil', newlines will be properly indented
:bind
(:map org-src-mode-map
("s-z" . org-edit-src-exit)))
```
`org-table` {#org-table}
_[org-table](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-table.el) is the plain-text table editor for org-mode._
```emacs-lisp
(use-feature org-table
:after org
:bind
(:map org-table-fedit-map
("s-c" . org-table-fedit-finish)))
```
`org-table-wrap` {#org-table-wrap}
_[org-table-wrap](https://github.com/benthamite/org-table-wrap) provides visual word-wrapping for org mode tables._
```emacs-lisp
(use-package org-table-wrap
:ensure (:host github
:repo "benthamite/org-table-wrap")
:after org-table
:demand t
:custom
(org-table-wrap-width-fraction 0.8)
:hook
(org-mode-hook . org-table-wrap-mode))
```
`orgtbl-edit` {#orgtbl-edit}
_[orgtbl-edit](https://github.com/shankar2k/orgtbl-edit) allows editing a spreadsheet or text-delimited file as an org table._
```emacs-lisp
(use-package orgtbl-edit
:ensure (:host github
:repo "shankar2k/orgtbl-edit")
:after org-table
:defer t)
```
`orgtbl-join` {#orgtbl-join}
_[orgtbl-join](https://github.com/tbanel/orgtbljoin) joins two org tables on a common column._
```emacs-lisp
(use-package orgtbl-join
:after org-table
:defer t)
```
`org-crypt` {#org-crypt}
_[org-crypt](https://orgmode.org/manual/Org-Crypt.html) encrypts the text under all headlines with a designated tag._
```emacs-lisp
(use-feature org-crypt
:after org
:defer t
:custom
(org-crypt-key (getenv "PERSONAL_GMAIL"))
(org-crypt-disable-auto-save t)
:config
(org-crypt-use-before-save-magic))
```
`org-element` {#org-element}
_[org-element](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-element.el) implements a parser and an API for org syntax._
```emacs-lisp
(use-feature org-element
:after org
:custom
;; set to nil to temporarily disable cache and avoid `org-element-cache' errors
(org-element-use-cache t)
:config
;; `org-capture' creates an indirect buffer with `clone', which copies
;; the base buffer's cache. The cloned cache is immediately stale
;; because the capture buffer is narrowed and has the template inserted.
;; Reset it so the cache is rebuilt from scratch in the capture buffer.
(add-hook 'org-capture-mode-hook #'org-element-cache-reset)
;; `org-element-at-point' wraps its cache recovery code in
;; `condition-case-unless-debug', which is disabled when
;; `debug-on-error' is non-nil. This causes every cache error—stale
;; entries, invalid search bounds, missing parents—to enter the
;; debugger instead of being handled by Org's built-in reset-and-retry
;; logic. Temporarily binding `debug-on-error' to nil lets the
;; existing recovery work while preserving the debugger for all other
;; code.
(define-advice org-element-at-point (:around (fn &rest args) let-cache-recovery-work)
(let ((debug-on-error nil))
(apply fn args))))
```
`org-lint` {#org-lint}
_[org-lint](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-lint.el) checks org files for common errors._
```emacs-lisp
(use-feature org-lint
:after org
:defer t)
```
`org-habit` {#org-habit}
_[org-habit](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-habit.el) tracks habits and displays consistency graphs in the agenda._
```emacs-lisp
(use-feature org-habit
:after org org-agenda
:custom
(org-habit-today-glyph #x1f4c5)
(org-habit-completed-glyph #x2713)
(org-habit-preceding-days 25)
(org-habit-following-days 1)
(org-habit-graph-column 85)
(org-habit-show-habits nil)
(org-habit-show-habits-only-for-today nil))
```
`org-contrib` {#org-contrib}
_[org-contrib](https://git.sr.ht/~bzg/org-contrib) features add-ons to `org-mode`._
```emacs-lisp
(use-package org-contrib
:after org)
```
`org-checklist` {#org-checklist}
_[org-checklist](https://git.sr.ht/~bzg/org-contrib/tree/main/item/lisp/org-checklist.el) resets checkboxes in repeating org entries._
Checkboxes are reset only in headings that have the property `RESET_CHECK_BOXES` set to `t`. You can set the property by invoking the command `org-set-property` with point on the heading or immediately below it.
```emacs-lisp
(use-feature org-checklist
:after org-contrib
:defer 30)
```
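For instance, a repeating task set up for checkbox resetting might look like this (hypothetical heading and dates):

```org
* TODO Weekly review
SCHEDULED: <2026-01-05 Mon ++1w>
:PROPERTIES:
:RESET_CHECK_BOXES: t
:END:
- [ ] Process inbox
- [ ] Review agenda
```

Marking the task DONE advances the repeater to the next date and clears the checkboxes.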
`org-make-toc` {#org-make-toc}
_[org-make-toc](https://github.com/alphapapa/org-make-toc) generates automatic tables of contents for org files._
```emacs-lisp
(use-package org-make-toc
:after org
:defer 30)
```
`org-journal` {#org-journal}
_[org-journal](https://github.com/bastibe/org-journal) is an org-mode based journaling mode._
```emacs-lisp
(use-package org-journal
:after org
:custom
(org-journal-dir paths-dir-journal)
(org-journal-date-format "%A, %d %B %Y")
(org-journal-file-format "%Y-%m-%d.org")
:config
(defun org-journal-new-entry-in-journal ()
"Create a new journal entry in the selected journal."
(interactive)
(let* ((journal-dirs (list paths-dir-tlon-todos paths-dir-journal))
(cons (mapcar (lambda (dir)
(cons (file-name-nondirectory (directory-file-name dir)) dir))
journal-dirs))
(choice (completing-read "Journal: " cons))
(org-journal-dir (alist-get choice cons nil nil 'string=)))
(org-journal-new-entry nil)))
:bind
("A-j" . org-journal-new-entry-in-journal))
```
`org-contacts` {#org-contacts}
_[org-contacts](https://repo.or.cz/org-contacts.git) is a contacts management system for Org Mode._
```emacs-lisp
(use-package org-contacts
:ensure (:build (:not elpaca-check-version))
:after org tlon
:defer t
:custom
(org-contacts-files `(,(file-name-concat paths-dir-tlon-repos "babel-core/contacts.org"))))
```
`org-vcard` {#org-vcard}
_[org-vcard](https://github.com/pinoaffe/org-vcard) imports and exports vCards from within org-mode._
```emacs-lisp
(use-package org-vcard
:defer t
:custom
;; optimized for macOS Contacts.app
(org-vcard-styles-languages-mappings
'(("flat"
(("en"
(("3.0"
(("ADDRESS_HOME" . "ADR;TYPE=\"home\";PREF=1")
("ADDRESS_HOME" . "ADR;TYPE=\"home\"")
("BIRTHDAY" . "BDAY")
("EMAIL" . "EMAIL;PREF=1")
("FN" . "FN")
("PHOTO" . "PHOTO;TYPE=JPEG")
("URL" . "item1.URL;type=pref")))
("4.0"
(("ADDRESS_HOME" . "ADR;TYPE=\"home\";PREF=1")
("ADDRESS_HOME" . "ADR;TYPE=\"home\"")
("BIRTHDAY" . "BDAY")
("EMAIL" . "EMAIL;PREF=1")
("FN" . "FN")
("PHOTO" . "PHOTO;TYPE=JPEG")
("URL" . "URL;PREF=1")))
("2.1"
(("ADDRESS_HOME" . "ADR;HOME;PREF")
("ADDRESS_HOME" . "ADR;HOME")
("BIRTHDAY" . "BDAY")
("EMAIL" . "EMAIL;PREF")
("FN" . "FN")
("PHOTO" . "PHOTO;TYPE=JPEG")
("URL" . "URL;PREF")))))))))
:config
(defun org-vcard--save-photo-and-make-link (base64-data contact-name)
"Save base64 photo data to file and return org link."
(let* ((photo-dir (expand-file-name "contact-photos" user-emacs-directory))
(safe-name (replace-regexp-in-string "[^a-zA-Z0-9]" "_" contact-name))
(filename (expand-file-name (concat safe-name ".jpg") photo-dir)))
(unless (file-directory-p photo-dir)
(make-directory photo-dir t))
(with-temp-file filename
(set-buffer-multibyte nil) ; Use unibyte mode for binary data
(let ((coding-system-for-write 'binary))
;; Remove potential MIME type header if present
(when (string-match "^data:image/[^;]+;base64," base64-data)
(setq base64-data (substring base64-data (match-end 0))))
;; Process base64 data in chunks
(let ((chunk-size 4096)
(start 0)
(total-length (length base64-data)))
(while (< start total-length)
(let* ((end (min (+ start chunk-size) total-length))
(chunk (substring base64-data start end)))
(insert (base64-decode-string chunk))
(setq start end))))))
(format "\n#+ATTR_ORG: :width 300\n[[file:%s]]\n" filename)))
(advice-add 'org-vcard--transfer-write :around
(lambda (orig-fun direction content destination)
"Convert PHOTO properties to image links before writing."
(if (eq direction 'import)
(let ((modified-content
(with-temp-buffer
(insert content)
(goto-char (point-min))
(while (re-search-forward "^:PHOTO: \\(.+\\)$" nil t)
(let* ((base64-data (match-string 1))
(heading (save-excursion
(re-search-backward "^\\* \\(.+\\)$" nil t)
(match-string 1)))
(photo-link (condition-case err
(org-vcard--save-photo-and-make-link base64-data heading)
(error
(message "Error processing photo for %s: %s" heading (error-message-string err))
":PHOTO: [Error processing photo]\n"))))
(delete-region (line-beginning-position) (line-end-position))
;; Delete any following empty line
(when (looking-at "\n")
(delete-char 1))
(save-excursion
(re-search-forward ":END:\n" nil t)
(insert photo-link))))
(buffer-string))))
(funcall orig-fun direction modified-content destination))
(funcall orig-fun direction content destination)))))
```
`org-autosort` {#org-autosort}
_[org-autosort](https://github.com/yantar92/org-autosort) sorts entries in org files automatically._
```emacs-lisp
(use-package org-autosort
:ensure (:host github
:repo "yantar92/org-autosort")
:after org
:defer 30)
```
`ox-clip` {#ox-clip}
_[ox-clip](https://github.com/jkitchin/ox-clip) copies selected regions in org-mode as formatted text on the clipboard._
```emacs-lisp
(use-package ox-clip
:after org
:defer t
:custom
;; github.com/jkitchin/ox-clip/issues/13
(ox-clip-osx-cmd
'(("HTML" . "hexdump -ve '1/1 \"%.2x\"' | xargs printf \"set the clipboard to {text:\\\" \\\", «class HTML»:«data HTML%s»}\" | osascript -")
("Markdown" . "pandoc --wrap=none -f html -t \"markdown-smart+hard_line_breaks\" - | grep -v \"^:::\" | sed 's/{#.*}//g' | sed 's/\\\\`/`/g' | pbcopy"))))
```
`elgantt` {#elgantt}
_[elgantt](https://github.com/legalnonsense/elgantt/) is a gantt chart for org mode._
```emacs-lisp
(use-package elgantt
:ensure (:host github
:repo "legalnonsense/elgantt")
:after org
:defer t)
```
`org-pomodoro` {#org-pomodoro}
_[org-pomodoro](https://github.com/marcinkoziej/org-pomodoro) provides org-mode support for the Pomodoro technique._
```emacs-lisp
(use-package org-pomodoro
:after org org-agenda
:custom
(org-pomodoro-length 25)
(org-pomodoro-short-break-length 5)
(org-pomodoro-long-break-length org-pomodoro-short-break-length)
(org-pomodoro-finished-sound "/System/Library/Sounds/Blow.aiff")
(org-pomodoro-long-break-sound org-pomodoro-finished-sound)
(org-pomodoro-short-break-sound org-pomodoro-finished-sound)
:hook
(org-pomodoro-started-hook . org-extras-pomodoro-format-timer)
(org-pomodoro-started-hook . tab-bar-extras-disable-all-notifications)
(org-pomodoro-started-hook . org-extras-narrow-to-entry-and-children)
(org-pomodoro-finished-hook . tab-bar-extras-enable-all-notifications)
(org-pomodoro-finished-hook . widen)
:bind
(("H-I" . org-pomodoro)
("M-s-e" . org-pomodoro-extend-last-clock)))
```
- check: <https://gist.github.com/bravosierrasierra/1d98a89a7bcb618ef70c6c4a92af1a96#file-org-pomodoro-plus>
`org-pomodoro-extras` {#org-pomodoro-extras}
_[org-pomodoro-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-pomodoro-extras.el) collects my extensions for `org-pomodoro`._
```emacs-lisp
(use-personal-package org-pomodoro-extras
:after org-pomodoro)
```
`org-percentile` {#org-percentile}
_[org-percentile](https://github.com/benthamite/org-percentile) is a productivity tool that lets you compete with your past self._
```emacs-lisp
(use-package org-percentile
:disabled
:ensure (:host github
:repo "benthamite/org-percentile")
:after org-clock
:custom
(org-percentile-data-file (file-name-concat paths-dir-dropbox "misc/org-percentile-data.el"))
:config
(org-percentile-mode))
```
note-taking {#note-taking}
`org-roam` {#org-roam}
_[org-roam](https://github.com/org-roam/org-roam) is a Roam replica with org-mode._
```emacs-lisp
(use-package org-roam
:ensure (:host github
:repo "benthamite/org-roam"
:branch "fix/handle-nil-db-version") ; https://github.com/org-roam/org-roam/pull/2609
:after org
:custom
(org-roam-directory paths-dir-org-roam)
(org-roam-node-display-template
(concat "${title:*} "
(propertize "${tags:10}" 'face 'org-tag)))
(org-roam-node-display-template
(concat "${hierarchy:160} "
(propertize "${tags:20}" 'face 'org-tag)))
;; exclude selected headings based on other criteria
(org-roam-db-node-include-function #'org-roam-extras-node-include-function)
:config
(with-eval-after-load 'org-roam-bibtex
(setopt org-roam-mode-sections (append org-roam-mode-sections '(orb-section-reference orb-section-abstract))))
;; include transcluded links in `org-roam' backlinks
(delete '(keyword "transclude") org-roam-db-extra-links-exclude-keys)
;; https://github.com/org-roam/org-roam/issues/2550#issuecomment-3451456331
(with-eval-after-load 'org-roam-capture
(setopt org-roam-capture-new-node-hook nil))
:bind
(:map org-mode-map
("s-i" . org-roam-node-insert)
:map org-roam-mode-map
("f" . ace-link-org)))
```
`org-roam-extras` {#org-roam-extras}
_[org-roam-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-roam-extras.el) collects my extensions for `org-roam`._
```emacs-lisp
(use-personal-package org-roam-extras
:custom
(org-roam-extras-auto-show-backlink-buffer t)
:config
;; exclude headings in specific files and directories
;; Set here rather than in :custom so that `org-roam-extras-excluded-dirs'
;; and `org-roam-extras-excluded-files' are guaranteed to be defined (they
;; are void if org-roam loads first, e.g. via a citar idle timer).
(setopt org-roam-file-exclude-regexp
(let (result)
(dolist (dir-or-file
(append
org-roam-extras-excluded-dirs
org-roam-extras-excluded-files)
(regexp-opt result))
(push (if (file-directory-p dir-or-file)
(file-relative-name dir-or-file paths-dir-org-roam)
dir-or-file)
result))))
(org-roam-extras-setup-db-sync)
:hook
(org-capture-prepare-finalize-hook . org-roam-extras-remove-file-level-properties)
:bind
(("H-N" . org-roam-extras-new-note)
("H-j" . org-roam-extras-node-find)
("H-J" . org-roam-extras-node-find-special)))
```
`org-roam-ui` {#org-roam-ui}
_[org-roam-ui](https://github.com/org-roam/org-roam-ui) is a graphical frontend for exploring org-roam._
```emacs-lisp
(use-package org-roam-ui
:ensure (:host github
:repo "org-roam/org-roam-ui"
:branch "main"
:files ("*.el" "out"))
:after org-roam
:defer t
:custom
(org-roam-ui-sync-theme t)
(org-roam-ui-follow t)
(org-roam-ui-update-on-save nil)
(org-roam-ui-open-on-start nil))
```
`org-transclusion` {#org-transclusion}
_[org-transclusion](https://github.com/nobiot/org-transclusion) supports [transclusion](https://en.wikipedia.org/wiki/Transclusion) with org-mode._
```emacs-lisp
(use-package org-transclusion
:after org
:defer t
:config
(dolist (element '(headline drawer property-drawer))
(push element org-transclusion-exclude-elements))
(face-spec-set 'org-transclusion-fringe
'((((background light))
:foreground "black")
(t
:foreground "white"))
'face-override-spec)
(face-spec-set 'org-transclusion-source-fringe
'((((background light))
:foreground "black")
(t
:foreground "white"))
'face-override-spec))
```
`vulpea` {#vulpea}
_[vulpea](https://github.com/d12frosted/vulpea) is a collection of functions for note-taking based on `org` and `org-roam`._
I use this package to define a dynamic agenda, as explained and illustrated [here](https://d12frosted.io/posts/2021-01-16-task-management-with-roam-vol5.html). I've made some changes to the system in that link, specifically to exclude files and directories at various stages:
1. At the broadest level, I exclude files and directories from the function (`org-extras-id-auto-add-ids-to-headings-in-file`) that otherwise automatically adds an ID to every org heading in a file-visiting buffer. Headings so excluded are not indexed by org-roam, because a heading requires an ID to be indexed. For details, see that function’s docstring. For examples of how this is used in my config, see the variables `org-extras-id-auto-add-excluded-files` and `org-extras-id-auto-add-excluded-directories` under the `org-id` section of this file.
2. I then exclude some headings with IDs from the org-roam database. For examples of how this is used in my config, see the variables `org-roam-file-exclude-regexp` and `org-roam-db-node-include-function` under the `org-roam` section of this file.
3. Finally, I selectively include in `org-agenda-files` files that satisfy certain conditions (as defined by `vulpea-extras-project-p`) and files modified recently (as specified by `org-roam-extras-recent`), and exclude from `org-agenda-files` files listed in `org-extras-agenda-files-excluded`.
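The core of the dynamic agenda from the linked post can be sketched as follows. This is a simplified version of d12frosted's code, not my actual implementation (which lives in `vulpea-extras` and adds the exclusions listed above); the `project` tag and the `my/` names are assumptions for illustration:

```emacs-lisp
(defun my/vulpea-project-files ()
  "Return a list of note files containing the `project' tag."
  (seq-uniq
   (seq-map
    #'car
    (org-roam-db-query
     [:select [nodes:file]
      :from tags
      :left-join nodes
      :on (= tags:node-id nodes:id)
      :where (like tag (quote "%\"project\"%"))]))))

(defun my/vulpea-agenda-files-update (&rest _)
  "Set `org-agenda-files' from the org-roam database."
  (setq org-agenda-files (my/vulpea-project-files)))

;; Rebuild the file list right before the agenda is constructed.
(advice-add 'org-agenda :before #'my/vulpea-agenda-files-update)
```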
```emacs-lisp
(use-package vulpea
:after org org-roam)
```
`vulpea-extras` {#vulpea-extras}
_[vulpea-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/vulpea-extras.el) collects my extensions for `vulpea`._
```emacs-lisp
(use-personal-package vulpea-extras
:hook
((find-file-hook before-save-hook) . vulpea-extras-project-update-tag))
```
`org-noter` {#org-noter}
_[org-noter](https://github.com/org-noter/org-noter) is an org-mode document annotator._
```emacs-lisp
(use-package org-noter
:ensure (:host github
:repo "org-noter/org-noter")
:after org-extras
:init
(with-eval-after-load 'pdf-annot
(bind-keys :map pdf-annot-minor-mode-map
("s-s" . org-noter-create-skeleton)))
:custom
(org-noter-notes-search-path `(,paths-dir-bibliographic-notes))
(org-noter-auto-save-last-location t)
(org-noter-always-create-frame nil)
(org-noter-separate-notes-from-heading t)
(org-noter-kill-frame-at-session-end nil)
(org-noter-use-indirect-buffer nil)
:config
(push paths-file-orb-noter-template org-extras-id-auto-add-excluded-files)
:bind
(:map org-noter-notes-mode-map
("s-n" . org-noter-sync-current-note)))
```
- To check: <https://org-roam.discourse.group/t/org-roam-bibtex-in-a-sub-directory/649/5>
- <https://notes.andymatuschak.org/About_these_notes>
`org-noter-extras` {#org-noter-extras}
_[org-noter-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-noter-extras.el) collects my extensions for `org-noter`._
```emacs-lisp
(use-personal-package org-noter-extras
:after org-noter
:demand t
:bind
(:map org-noter-notes-mode-map
("s-a" . org-noter-extras-cleanup-annotation)
("s-d" . org-noter-extras-dehyphenate)
("s-k" . org-noter-extras-sync-prev-note)
("s-l" . org-noter-extras-sync-next-note)
("s-o" . org-noter-extras-highlight-offset)))
```
spaced-repetition {#spaced-repetition}
`anki-editor` {#anki-editor}
_[anki-editor](https://github.com/anki-editor/anki-editor) is a minor mode for making Anki cards with Org Mode._
The [original package](https://github.com/louietan/anki-editor) is abandoned, but there is an actively maintained [fork](https://github.com/anki-editor/anki-editor).
```emacs-lisp
(use-package anki-editor
:ensure (:host github
:repo "anki-editor/anki-editor")
:defer t
:custom
;; https://github.com/anki-editor/anki-editor/issues/116
(anki-editor-export-note-fields-on-push nil)
(anki-editor-org-tags-as-anki-tags nil))
```
`anki-editor-extras` {#anki-editor-extras}
_[anki-editor-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/anki-editor-extras.el) collects my extensions for `anki-editor`._
```emacs-lisp
(use-personal-package anki-editor-extras
:after anki-editor
:demand t)
```
`ankiorg` {#ankiorg}
_[ankiorg](https://github.com/orgtre/ankiorg) is an anki-editor add-on which pulls Anki notes to Org._
```emacs-lisp
(use-package ankiorg
:ensure (:host github :repo "orgtre/ankiorg")
:demand t
:after anki-editor
:custom
(ankiorg-sql-database
(file-name-concat no-littering-var-directory "ankiorg/collection.anki2"))
(ankiorg-media-directory
(file-name-concat no-littering-var-directory "ankiorg/img"))
(ankiorg-pick-deck-all-directly t))
```
`anki-noter` {#anki-noter}
_[anki-noter](https://github.com/benthamite/anki-noter) is an AI-powered Anki note generator._
```emacs-lisp
(use-package anki-noter
:ensure (:host github
:repo "benthamite/anki-noter")
:defer t)
```
reference & citation {#reference-and-citation}
See [this section](https://github.com/emacs-citar/citar/wiki/Comparisons#summary-of-diverse-emacs-bibliographic-and-citation-packages) of citar's manual for a handy summary of the main bibliographic and citation Emacs packages.
I split my bibliographies into two categories: personal and work. The files providing my personal bibliography are defined in `paths-files-bibliography-personal`. The files providing my work bibliography are defined in `tlon-bibliography-files`. I then define `paths-files-bibliography-all` as the concatenation of these two lists. Finally, this master variable is used to set the values of the user options for all packages that define bibliographies:
- `bibtex-files` (for `bibtex`)
- `bibtex-completion-bibliography` (for `bibtex-completion`)
- `citar-bibliography` (for `citar`)
- `ebib-preload-bib-files` (for `ebib`)
Each of these packages requires `tlon`, since the latter must load for `paths-files-bibliography-all` to be set.
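In code, the scheme amounts to something like this (a sketch using the variable names from the text; the real definitions live in my `paths` and `tlon` libraries):

```emacs-lisp
;; Master list: personal bibliographies first, then the work ones.
(setq paths-files-bibliography-all
      (append paths-files-bibliography-personal
              tlon-bibliography-files))

;; Each consumer is then pointed at the master list, e.g.:
;; (setq citar-bibliography paths-files-bibliography-all)
```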
`oc` {#oc}
_[oc](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/oc.el) is Org mode's built-in citation handling library._
```emacs-lisp
(use-feature oc
:after org el-patch
:defer t
:custom
(org-cite-insert-processor 'citar)
(org-cite-follow-processor 'citar) ; `org-open-at-point' integration
(org-cite-activate-processor 'citar)
(org-cite-export-processors
'((t . (csl "long-template.csl")))))
```
`oc-csl` {#oc-csl}
_[oc-csl](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/oc-csl.el) is a CSL citation processor for Org mode's citation system._
```emacs-lisp
(use-feature oc-csl
:after oc
:defer t
:custom
(org-cite-csl-styles-dir paths-dir-tlon-csl-styles)
(org-cite-csl-locales-dir paths-dir-tlon-csl-locales))
```
`org-footnote` {#org-footnote}
_[org-footnote](https://github.com/emacs-mirror/emacs/blob/master/lisp/org/org-footnote.el) provides footnote support in org-mode._
```emacs-lisp
(use-feature org-footnote
:defer t
:custom
(org-footnote-auto-adjust t))
```
`citeproc` {#citeproc}
_[citeproc](https://github.com/andras-simonyi/citeproc-el) is a CSL 1.0.2 Citation Processor for Emacs._
```emacs-lisp
(use-package citeproc
:after oc
:defer t)
```
`bibtex` {#bibtex}
_bibtex is a major mode for editing and validating BibTeX `.bib` files._
```emacs-lisp
(use-feature bibtex
:after ebib
:custom
(bibtex-files paths-files-bibliography-all)
;; This corresponds (roughly?) to `auth+year+shorttitle(3,3)' on Better BibTeX
;; retorque.re/zotero-better-bibtex/citing/
(bibtex-search-entry-globally t)
(bibtex-autokey-names 1)
(bibtex-autokey-name-case-convert-function 'capitalize)
(bibtex-autokey-year-length 4)
(bibtex-autokey-year-title-separator "")
(bibtex-autokey-title-terminators "[.!?;]\\|--")
(bibtex-autokey-titlewords 3)
(bibtex-autokey-titlewords-stretch 0)
(bibtex-autokey-titleword-case-convert-function 'capitalize)
(bibtex-autokey-titleword-length nil)
(bibtex-autokey-titleword-separator "")
(bibtex-autokey-titleword-ignore '("A" "a" "An" "an" "On" "on" "The" "the" "Eine?" "Der" "Die" "Das" "El" "La" "Lo" "Los" "Las" "Un" "Una" "Unos" "Unas" "el" "la" "lo" "los" "las" "un" "una" "unos" "unas" "y" "o" "Le" "La" "L'" "Les" "Un" "Une" "Des" "Du" "De la" "De l'" "Des" "le" "la" "l'" "les" "un" "une" "des" "du" "de la" "de l'" "des" "Lo" "Il" "La" "L'" "Gli" "I" "Le" "Uno" "lo" "il" "la" "l'" "gli" "i" "le" "uno"))
;; Remove accents
(bibtex-autokey-before-presentation-function 'simple-extras-asciify-string)
;; check tweaked version of `bibtex-format-entry' above
(bibtex-entry-format '(opts-or-alts-fields last-comma delimiters page-dashes))
(bibtex-field-indentation 8) ; match ebib value
:config
(require 'tlon) ; see explanatory note under ‘reference & citation’
(push '("\\." . "") bibtex-autokey-name-change-strings)
;; add extra entry types
(dolist (entry '(("Video" . "Video file")
("Movie" . "Film")
("tvepisode" . "TV episode")))
(push `(,(car entry) ,(cdr entry)
(("author" nil nil 0)
("title")
("date" nil nil 1)
("year" nil nil -1)
("url" nil nil 2))
nil
(("abstract")
("keywords")
("language")
("version")
("rating")
("letterboxd")
("note")
("organization")
("eprintclass" nil nil 4)
("primaryclass" nil nil -4)
("eprinttype" nil nil 5)
("archiveprefix" nil nil -5)
("urldate")))
bibtex-biblatex-entry-alist))
:bind
(:map bibtex-mode-map
("s-f" . ebib-extras-open-file-dwim)
("s-/" . ebib-extras-attach-most-recent-file)
("s-a" . bibtex-set-field)
("s-c" . bibtex-copy-entry-as-kill)
("s-v" . bibtex-yank)
("s-x" . bibtex-kill-entry)
("A-C-H-x" . bibtex-copy-entry-as-kill)
("A-C-H-c" . bibtex-kill-entry)
("A-C-H-a" . bibtex-copy-field-as-kill)
("A-C-H-f" . bibtex-kill-field)
("A-C-s-r" . bibtex-previous-entry)
("A-C-s-f" . bibtex-next-entry)))
```
`bibtex-extras` {#bibtex-extras}
_[bibtex-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/bibtex-extras.el) collects my extensions for `bibtex`._
```emacs-lisp
(use-personal-package bibtex-extras
:after bibtex
:demand t
:custom
(bibtex-maintain-sorted-entries
'(bibtex-extras-entry-sorter bibtex-extras-lessp))
:config
;; Replace 'online' entry type
(bibtex-extras-replace-element-by-name
bibtex-biblatex-entry-alist
"Online" '("Online" "Online Resource"
(("author" nil nil 0) ("title") ("journaltitle" nil nil 3)
("date" nil nil 1) ("year" nil nil -1)
("doi" nil nil 2) ("url" nil nil 2))
nil
(("subtitle") ("language") ("version") ("note")
("organization") ("month")
("pubstate") ("eprintclass" nil nil 4) ("primaryclass" nil nil -4)
("eprinttype" nil nil 5) ("archiveprefix" nil nil -5) ("urldate"))))
(add-to-list 'bibtex-biblatex-entry-alist
'("Performance" "A performance entry"
(("author") ("title") ("date")) ; Required fields
nil ; Crossref fields
(("venue") ("location") ("note"))))
:bind
(:map bibtex-mode-map
("s-a" . bibtex-extras-set-field)
("s-d" . bibtex-extras-url-to-pdf-attach)
("s-h" . bibtex-extras-url-to-html-attach)
("s-i" . bibtex-extras-open-in-ebib)
("s-t" . bibtex-extras-move-entry-to-tlon)))
```
`bibtex-completion` {#bibtex-completion}
_[bibtex-completion](https://github.com/tmalsburg/helm-bibtex) is a backend for searching and managing bibliographies in Emacs._
The package is required by org-roam-bibtex.
```emacs-lisp
(use-package bibtex-completion
:ensure (:version (lambda (_) "2.0.0")) ; github.com/progfolio/elpaca/issues/229
:after bibtex
:custom
(bibtex-completion-bibliography paths-files-bibliography-all)
(bibtex-completion-pdf-open-function 'find-file)
(bibtex-completion-notes-path paths-dir-bibliographic-notes)
(bibtex-completion-pdf-field "file")
(bibtex-dialect 'biblatex)
(bibtex-completion-library-path paths-dir-pdf-library)
:config
(require 'tlon)) ; see explanatory note under ‘reference & citation’
```
`bibtex-completion-extras` {#bibtex-completion-extras}
_[bibtex-completion-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/bibtex-completion-extras.el) collects my extensions for `bibtex-completion`._
```emacs-lisp
(use-personal-package bibtex-completion-extras
:after bibtex-completion)
```
`org-roam-bibtex` {#org-roam-bibtex}
_[org-roam-bibtex](https://github.com/org-roam/org-roam-bibtex) integrates org-roam and bibtex._
```emacs-lisp
(use-package org-roam-bibtex
:after bibtex-completion org-roam
:custom
(orb-roam-ref-format 'org-cite)
(orb-insert-interface 'citar-open-notes)
(orb-note-actions-interface 'default)
(orb-attached-file-extensions '("pdf"))
(org-roam-capture-templates
`(("r" "bibliography reference" plain
(file ,paths-file-orb-noter-template)
:if-new
(file ,paths-file-orb-capture-template)
:unnarrowed t :immediate-finish t :jump-to-captured t)))
:config
(dolist (keyword '("year" "title" "url" "keywords"))
(add-to-list 'orb-preformat-keywords keyword))
(org-roam-bibtex-mode)
;; https://github.com/org-roam/org-roam/issues/2550#issuecomment-3451456331
(setq org-roam-capture-new-node-hook nil))
```
`citar` {#citar}
_[citar](https://github.com/emacs-citar/citar) is a package to quickly find and act on bibliographic references, and edit org, markdown, and latex academic documents._
We defer loading the package to activate the timer that in turn updates the bibliography files when Emacs is idle, as we do with `ebib` below.
```emacs-lisp
(use-package citar
:ensure (:host github
:repo "emacs-citar/citar"
:includes (citar-org))
:defer 30
:custom
(citar-bibliography paths-files-bibliography-all)
(citar-notes-paths `(,paths-dir-bibliographic-notes))
(citar-at-point-function 'embark-act)
(citar-symbol-separator " ")
(citar-format-reference-function 'citar-citeproc-format-reference)
(citar-templates '((main . "${author editor:30%sn} ${date year issued:4} ${title:60} ${database:10}")
(suffix . " ${=key= id:25} ${=type=:12}")
(preview . "${author editor:%etal} (${year issued date}) ${title}, ${journal journaltitle publisher container-title collection-title}.\n")
(note . "Notes on ${author editor:%etal}, ${title}")))
(citar-notes-source 'orb-citar-source)
:config
(require 'tlon) ; see explanatory note under ‘reference & citation’
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'citar-history))
(require 'citar-org-roam)
(citar-register-notes-source
'orb-citar-source (list :name "Org-Roam Notes"
:category 'org-roam-node
:items #'citar-org-roam--get-candidates
:hasitems #'citar-org-roam-has-notes
:open #'citar-org-roam-open-note
:create #'orb-citar-edit-note
:annotate #'citar-org-roam--annotate))
;; allow invocation of `citar-insert-citation' in any buffer. Although it is
;; not possible to insert citations in some modes, it is still useful to be
;; able to run this command because of the `embark' integration
(setf (alist-get 't citar-major-mode-functions)
(cons '(insert-citation . citar-org-insert-citation)
(alist-get 't citar-major-mode-functions)))
:bind
(("H-/" . citar-insert-citation)
:map citar-map
("c" . embark-copy-as-kill)
("u" . citar-open-links)
("s" . ebib-extras-search-dwim)
("t" . citar-extras-move-entry-to-tlon)
("b" . citar-extras-goto-bibtex-entry)
("i" . citar-extras-open-in-ebib)
:map citar-citation-map
("c" . embark-copy-as-kill)
("u" . citar-open-links)
("s" . ebib-extras-search-dwim)
("t" . citar-extras-move-entry-to-tlon)
("b" . citar-extras-goto-bibtex-entry)
("i" . citar-extras-open-in-ebib)))
```
`citar-extras` {#citar-extras}
_[citar-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/citar-extras.el) collects my extensions for `citar`._
```emacs-lisp
(use-personal-package citar-extras
:after citar
:config
;; https://github.com/emacs-citar/citar/wiki/Indicators
(setq citar-indicators
(list citar-extras-indicator-links-icons
citar-extras-indicator-files-icons
citar-extras-indicator-notes-icons
citar-extras-indicator-cited-icons)))
```
`citar-citeproc` {#citar-citeproc}
_[citar-citeproc](https://github.com/emacs-citar/citar/blob/main/citar-citeproc.el) provides Citeproc reference support for citar._
```emacs-lisp
(use-feature citar-citeproc
:after citar citeproc citar-extras tlon
:custom
(citar-citeproc-csl-styles-dir paths-dir-tlon-csl-styles)
(citar-citeproc-csl-locales-dir paths-dir-tlon-csl-locales))
```
`citar-embark` {#citar-embark}
_[citar-embark](https://github.com/emacs-citar/citar/tree/9d7088c1fe82e9cfa508ead7ef7738c732556644#embark) adds contextual access actions in the minibuffer and at-point via the citar-embark-mode minor mode._
```emacs-lisp
(use-package citar-embark
:after citar embark
:config
(citar-embark-mode))
```
`citar-org-roam` {#citar-org-roam}
_[citar-org-roam](https://github.com/emacs-citar/citar-org-roam) provides integration between citar and org-roam._
```emacs-lisp
(use-package citar-org-roam
:ensure (:host github
:repo "emacs-citar/citar-org-roam")
:after citar org-roam)
```
`org-ref` {#org-ref}
_[org-ref](https://github.com/jkitchin/org-ref) supports citations, cross-references, bibliographies in org-mode and useful bibtex tools._
I use this package only to run the cleanup function `org-ref-clean-bibtex-entry` after adding new entries to my bibliography and to occasionally call a few miscellaneous commands. I do not use any of its citation-related functionality, since I use `org-cite` for that.
```emacs-lisp
(use-package org-ref
:after zotra
:custom
(org-ref-bibtex-pdf-download-dir paths-dir-downloads)
(org-ref-insert-cite-function
(lambda ()
(org-cite-insert nil)))
:config
(dolist (fun '(org-ref-replace-nonascii
orcb-check-journal
orcb-download-pdf))
(delete fun org-ref-clean-bibtex-entry-hook)))
```
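Since `org-ref-clean-bibtex-entry` is the only command from the package I run routinely, a binding in `bibtex-mode` suffices for invoking it on the entry at point. The key chosen below is a hypothetical illustration, not part of my actual config:

```emacs-lisp
;; Hypothetical binding: clean up the BibTeX entry at point.
(with-eval-after-load 'bibtex
  (define-key bibtex-mode-map (kbd "s-C") #'org-ref-clean-bibtex-entry))
```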
`org-ref-extras` {#org-ref-extras}
_[org-ref-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-ref-extras.el) collects my extensions for `org-ref`._
```emacs-lisp
(use-personal-package org-ref-extras
:after org-ref)
```
`ebib` {#ebib}
_[ebib](https://github.com/joostkremers/ebib) ([homepage](http://joostkremers.github.io/ebib/)) is a BibTeX database manager for Emacs._
We defer loading the package to activate the timer that in turn updates the bibliography files when Emacs is idle, as we do with `citar` above.
```emacs-lisp
(use-package ebib
:custom
(ebib-preload-bib-files paths-files-bibliography-all)
(ebib-notes-directory paths-dir-bibliographic-notes)
(ebib-notes-use-org-capture t)
(ebib-notes-display-max-lines 9999)
(ebib-filename-separator ";")
(ebib-file-associations nil) ; do not open any file types externally
(ebib-layout 'index-only)
(ebib-bibtex-dialect 'biblatex)
(ebib-use-timestamp t)
(ebib-timestamp-format "%Y-%m-%d %T (%Z)")
(ebib-default-entry-type "online")
(ebib-uniquify-keys t)
(ebib-index-columns '(("Entry Key" 30 t)
("Author/Editor" 25 t)
("Year" 4 t)
("Title" 50 t)))
(ebib-extra-fields
'((biblatex "abstract" "keywords" "origdate" "langid" "translation" "narrator" "file" "timestamp" "wordcount" "rating" "crossref" "=key=")
(bibtex "crossref" "annote" "abstract" "keywords" "file" "timestamp" "url" "doi")))
:config
(require 'tlon) ; see explanatory note under ‘reference & citation’
:hook
(ebib-entry-mode-hook . visual-line-mode)
:bind
(:map ebib-multiline-mode-map
("s-c" . ebib-quit-multiline-buffer-and-save)
:map ebib-index-mode-map
("<return>" . ebib-edit-entry)
("A" . ebib-add-entry)
("D" . ebib-delete-entry)
("f" . avy-extras-ebib-view-entry)
("k" . ebib-prev-entry)
("l" . ebib-next-entry)
("H-s" . ebib-save-current-database)
("K" . ebib-copy-key-as-kill)
("Q" . ebib-quit)
("W" . zotra-download-attachment)
:map ebib-entry-mode-map
("TAB" . ebib-goto-next-set)
("<backtab>" . ebib-goto-prev-set)
("H-s" . ebib-save-current-database)
("H-S" . ebib-save-all-databases)
("!" . ebib-generate-autokey)
("A" . ebib-add-field)
("c" . ebib-copy-current-field-contents)
("D" . ebib-delete-current-field-contents)
("E" . ebib-edit-keyname)
("K" . ebib-copy-key-as-kill)
("Q" . ebib-quit)
("W" . zotra-download-attachment)))
```
The macro below generates the commands correctly, but attempting to define key bindings for them results in duplicate commands. I'm not sure what's going on; it seems to be related to `use-package`.
`ebib-utils` {#ebib-utils}
_ebib-utils provides internal utility functions for [ebib](https://github.com/joostkremers/ebib)._
```emacs-lisp
(use-feature ebib-utils
:after ebib
:custom
(ebib-hidden-fields ; unhide some fields
(cl-remove-if
(lambda (el)
(member el '("edition" "isbn" "timestamp" "titleaddon" "translator")))
ebib-hidden-fields))
:config
(add-to-list 'ebib-hidden-fields "year")) ; hide others
```
`ebib-extras` {#ebib-extras}
_[ebib-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/ebib-extras.el) collects my extensions for `ebib`._
```emacs-lisp
(use-personal-package ebib-extras
:init
(advice-add 'ebib-init :after #'ebib-extras-auto-reload-databases)
:custom
(ebib-extras-attach-existing-file-action 'overwrite)
:hook
(ebib-add-entry . ebib-extras-create-list-of-existing-authors)
:bind
(("A-i" . ebib-extras-open-or-switch)
:map ebib-index-mode-map
("," . ebib-extras-prev-entry)
("." . ebib-extras-next-entry)
("d" . ebib-extras-duplicate-entry)
("n" . ebib-extras-citar-open-notes)
("A-C-s-<tab>" . ebib-extras-end-of-index-buffer)
("s" . ebib-extras-sort)
:map ebib-entry-mode-map
("s-f" . ebib-extras-open-file-dwim)
("," . ebib-extras-prev-entry)
("." . ebib-extras-next-entry)
("d" . ebib-extras-duplicate-entry)
("n" . ebib-extras-citar-open-notes)
("SPC" . ebib-extras-open-file-dwim)
("/" . ebib-extras-attach-most-recent-file)
("?" . ebib-extras-attach-file)
(";" . ebib-extras-process-entry)
("a" . ebib-extras-search-amazon)
("b" . ebib-extras-get-or-open-entry)
("g" . ebib-extras-search-library-genesis)
("G" . ebib-extras-search-goodreads)
("h" . ebib-extras-open-html-file)
("H" . ebib-extras-open-html-file-externally)
("I" . ebib-extras-set-id)
("o" . ebib-extras-search-connected-papers)
("p" . ebib-extras-open-pdf-file)
("P" . ebib-extras-open-pdf-file-externally)
("R" . ebib-extras-set-rating)
("s" . ebib-extras-search-dwim)
("T" . ebib-extras-no-translation-found)
("u" . ebib-extras-browse-url-or-doi)
("V" . ebib-extras-search-internet-archive)
("x" . ebib-extras-search-university-of-toronto)
("y" . ebib-extras-search-hathitrust)
("z" . ebib-extras-search-google-scholar)
("s-d" . ebib-extras-url-to-pdf-attach)
("s-k" . ebib-extras-fetch-keywords)
("s-h" . ebib-extras-url-to-html-attach)
("s-r" . ebib-extras-rename-files)))
```
`bib` {#bib}
_[bib](https://github.com/benthamite/bib) fetches bibliographic metadata from various APIs._
```emacs-lisp
(use-package bib
:ensure (:host github
:repo "benthamite/bib"
:depth nil) ; clone entire repo, not just last commit
:after ebib
:defer t
:custom
(bib-isbndb-key
(auth-source-pass-get "key" (concat "tlon/babel/isbndb.com/" tlon-email-shared)))
(bib-omdb-key
(auth-source-pass-get 'secret "chrome/omdbapi.com"))
(bib-tmdb-key
(auth-source-pass-get "key" "chrome/themoviedb.org/stafforini"))
:bind
(:map ebib-index-mode-map
("t" . bib-zotra-add-entry-from-title)))
```
`zotra` {#zotra}
_[zotra](https://github.com/mpedramfar/zotra) provides functions to get bibliographic information from a URL via [Zotero translators](https://www.zotero.org/support/translators), but without relying on the Zotero client._
```emacs-lisp
(use-package zotra
:ensure (:host github
:repo "mpedramfar/zotra")
:defer t
:custom
(zotra-use-curl nil)
(zotra-url-retrieve-timeout 15)
(zotra-default-entry-format "biblatex")
(zotra-download-attachment-default-directory paths-dir-downloads)
(zotra-backend 'zotra-server)
(zotra-local-server-directory (file-name-concat paths-dir-external-repos "zotra-server/")))
```
`zotra-extras` {#zotra-extras}
_[zotra-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/zotra-extras.el) collects my extensions for `zotra`._
```emacs-lisp
(use-personal-package zotra-extras
:after ebib eww
:custom
(zotra-extras-use-mullvad-p t)
:hook
(zotra-after-get-bibtex-entry-hook . zotra-extras-after-add-process-bibtex)
:bind
(:map ebib-index-mode-map
("a" . zotra-extras-add-entry))
(:map eww-mode-map
("a" . zotra-extras-add-entry)))
```
`annas-archive` {#annas-archive}
_[annas-archive](https://github.com/benthamite/annas-archive) provides rudimentary integration for Anna’s Archive, the largest existing search engine for shadow libraries._
```emacs-lisp
(use-package annas-archive
:ensure (:host github
:repo "benthamite/annas-archive")
:defer t
:init
(with-eval-after-load 'ebib
(bind-keys :map ebib-entry-mode-map
("s-a" . annas-archive-download)))
(with-eval-after-load 'eww
(bind-keys :map eww-mode-map
("s-a" . annas-archive-download-file)))
:custom
(annas-archive-secret-key (auth-source-pass-get 'secret "tlon/core/annas-archive"))
(annas-archive-included-file-types '("pdf"))
(annas-archive-title-column-width 130))
```
email {#email}
`simple` {#simple}
_[simple](https://github.com/emacs-mirror/emacs/blob/master/lisp/simple.el) configures the mail user agent._
```emacs-lisp
(use-feature simple
:custom
(mail-user-agent 'mu4e-user-agent)
(read-mail-command 'mu4e))
```
`sendmail` {#sendmail}
_[sendmail](https://www.gnu.org/software/emacs/manual/html_node/emacs/Mail-Sending.html) is a mode that provides mail-sending facilities from within Emacs._
```emacs-lisp
(use-feature sendmail
:after (:any mu4e org-msg)
:custom
(send-mail-function 'smtpmail-send-it))
```
`smtpmail` {#smtpmail}
_[smtpmail](https://github.com/emacs-mirror/emacs/blob/master/lisp/mail/smtpmail.el) provides SMTP mail sending support._
```emacs-lisp
(use-feature smtpmail
:after (:any mu4e org-msg)
:custom
(smtpmail-smtp-user (getenv "PERSONAL_GMAIL"))
(smtpmail-local-domain "gmail.com")
(smtpmail-default-smtp-server "smtp.gmail.com")
(smtpmail-smtp-server "smtp.gmail.com")
(smtpmail-smtp-service 465)
(smtpmail-stream-type 'ssl))
```
`message` {#message}
_[message](https://www.gnu.org/software/emacs/manual/html_mono/message.html) is a message composition mode._
```emacs-lisp
(use-feature message
:after (:any mu4e org-msg)
:demand t
:custom
(message-kill-buffer-on-exit t) ; make `message-send-and-exit' kill buffer, not bury it
(message-send-mail-function 'smtpmail-send-it)
(message-elide-ellipsis "\n> [... %l lines omitted]\n")
(message-citation-line-function 'message-insert-formatted-citation-line)
(message-citation-line-format (concat "> From: %f\n"
"> Date: %a, %e %b %Y %T %z\n"
">")
message-ignored-cited-headers "")
:config
(faces-extras-set-and-store-face-attributes
'((message-header-name :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)
(message-header-subject :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)
(message-header-to :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)
(message-header-other :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)
(message-header-cc :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)))
:hook
(message-send-hook . buffer-disable-undo) ; required to avoid an error
:bind
(:map message-mode-map
("s-a" . ml-attach-file)
("s-b" . message-goto-body)
("s-c" . message-send-and-exit)
("s-f" . message-goto-from)
("s-s" . message-goto-subject)
("s-t" . message-goto-to)
("s-A-b" . message-goto-bcc)
("s-A-c" . message-goto-cc)
("s-A-s" . message-send)))
```
`mml` {#mml}
_[mml](https://www.gnu.org/software/emacs/manual/html_node/emacs-mime/Composing.html) is a library that parses a MML (MIME Meta Language) and generates MIME messages._
```emacs-lisp
(use-feature mml
:defer t)
```
`mu4e` {#mu4e}
_[mu4e](https://github.com/djcb/mu) is an Emacs-based e-mail client._
```emacs-lisp
(use-package mu4e
:ensure (:host github
:files ("mu4e/*.el" "build/mu4e/mu4e-meta.el" "build/mu4e/mu4e-config.el" "build/mu4e/mu4e.info")
:repo "djcb/mu"
:main "mu4e/mu4e.el"
:pre-build (("./autogen.sh") ("ninja" "-C" "build"))
:build (:not elpaca-build-docs)
:ref "1a501281" ; v.1.12.13
:depth nil)
:defer 30
:custom
;; uncomment the two user options below when debugging
;; (mu4e-debug t)
;; (mu4e-index-update-error-warning )
(mu4e-split-view 'single-window)
(mu4e-headers-show-target nil)
(mu4e-get-mail-command "sh $HOME/bin/mbsync-parallel")
(mu4e-update-interval (* 5 60))
(mu4e-drafts-folder "/Drafts")
(mu4e-sent-folder "/Sent")
(mu4e-refile-folder "/Refiled")
(mu4e-trash-folder "/Trash")
(mu4e-attachment-dir paths-dir-downloads)
(mu4e-change-filenames-when-moving t)
;; see also `mu4e-extras-set-shortcuts'
(mu4e-maildir-shortcuts
`((:maildir ,mu4e-drafts-folder :key ?d)
(:maildir ,mu4e-sent-folder :key ?t)
(:maildir ,mu4e-refile-folder :key ?r)
(:maildir ,mu4e-trash-folder :key ?x)))
(mu4e-compose-format-flowed t)
(mu4e-confirm-quit nil)
(mu4e-headers-date-format "%Y-%m-%d %H:%M")
(mu4e-search-include-related nil)
(mu4e-search-results-limit 1000)
(mu4e-headers-visible-lines 25)
(mu4e-hide-index-messages t)
(mu4e-sent-messages-behavior 'delete) ; Gmail already keeps a copy
;; performance improvements (but with downsides)
;; groups.google.com/g/mu-discuss/c/hRRNhM5mwr0
;; djcbsoftware.nl/code/mu/mu4e/Retrieval-and-indexing.html
(mu4e-index-cleanup t) ; nil improves performance but causes stale index errors
(mu4e-index-lazy-check t) ; t improves performance
(mu4e-compose-context-policy 'ask)
(mu4e-context-policy nil)
(mu4e-modeline-support nil)
(mu4e-headers-fields '((:human-date . 16)
(:from . 30)
(:subject)))
:config
(require 'mu4e-contrib)
(setf (alist-get 'trash mu4e-marks)
'(:char ("d" . "▼")
:prompt "dtrash"
:dyn-target (lambda (target msg) (mu4e-get-trash-folder msg))
;; Here's the main difference to the regular trash mark, no +T
;; before -N so the message is not marked as IMAP-deleted:
:action (lambda (docid msg target)
(mu4e--server-move docid (mu4e--mark-check-target target) "+S-u-N"))))
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'mu4e--search-hist))
;; do not override native `mu4e' completion with `org-contacts' completion
(remove-hook 'mu4e-compose-mode-hook 'org-contacts-setup-completion-at-point)
(faces-extras-set-and-store-face-attributes
'((mu4e-compose-separator-face :family faces-extras-fixed-pitch-font :height faces-extras-fixed-pitch-size)))
:bind
(("A-m" . mu4e)
:map mu4e-main-mode-map
("c" . mu4e-compose-new)
("h" . mu4e-display-manual)
("j" . mu4e-search-maildir)
("u" . mu4e-update-mail-and-index)
:map mu4e-headers-mode-map
(";" . mu4e-copy-message-path)
("<" .= mu4e-headers-split-view-shrink)= ("=>" . mu4e-headers-split-view-grow)
("s-f" . mu4e-compose-forward)
("i" . mu4e-select-other-view)
("c" . mu4e-compose-new)
("*" . mu4e-headers-mark-all)
("A" . mu4e-headers-mark-all-unread-read)
("d" . mu4e-headers-mark-for-delete)
("f" . avy-extras-headers-view-message)
("k" . mu4e-headers-prev)
("l" . mu4e-headers-next)
("m" . mu4e-headers-mark-for-something)
("R" . mu4e-headers-mark-for-refile)
("V" . mu4e-headers-mark-for-move)
:map mu4e-view-mode-map
("," . shr-heading-previous)
("." . shr-heading-next)
(";" . mu4e-copy-message-path)
("<" .= mu4e-headers-split-view-shrink)= ("=>" . mu4e-headers-split-view-grow)
("s-f" . mu4e-compose-forward)
("i" . mu4e-select-other-view)
("c" . mu4e-compose-new)
("," . mu4e-view-headers-next)
("." . mu4e-view-headers-prev)
("d" . mu4e-view-mark-for-delete)
("f" . ace-link-extras-mu4e)
("L" . mu4e-view-save-attachments)
("m" . mu4e-view-mark-for-something)
("A-C-s-u" . nil)
("A-C-s-p" . nil)
("s-c" . org-extras-eww-copy-for-org-mode)
:map mu4e-compose-minor-mode-map
("E" . nil)
:map mu4e-minibuffer-search-query-map
("M-k" . previous-history-element)
("M-l" . next-history-element)
:map mu4e-search-minor-mode-map
("c" . nil)))
```
`mu4e-extras` {#mu4e-extras}
_[mu4e-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/mu4e-extras.el) collects my extensions for `mu4e`._
```emacs-lisp
(use-personal-package mu4e-extras
:after mu4e
:demand t
:custom
(mu4e-extras-inbox-folder "/Inbox")
(mu4e-extras-daily-folder "/Daily")
(mu4e-extras-epoch-inbox-folder "/epoch/Inbox")
(mu4e-extras-epoch-sent-folder "/epoch/Sent")
(mu4e-extras-epoch-drafts-folder "/epoch/Drafts")
(mu4e-extras-epoch-refiled-folder "/epoch/Refiled")
(mu4e-extras-epoch-trash-folder "/epoch/Trash")
(mu4e-extras-wide-reply t)
:config
(mu4e-extras-set-shortcuts)
(mu4e-extras-set-bookmarks)
(mu4e-extras-set-contexts)
(mu4e-extras-set-account-folders)
:hook
(mu4e-mark-execute-pre-hook . mu4e-extras-gmail-fix-flags)
(mu4e-view-mode-hook . mu4e-extras-set-face-locally)
(mu4e-update-pre-hook . mu4e-extras-set-index-params)
(mu4e-index-updated-hook . mu4e-extras-reapply-read-status)
(message-sent-hook . mu4e-extras-add-sent-to-mark-as-read-queue)
:bind
(:map mu4e-main-mode-map
("g" . mu4e-extras-compose-new-externally)
("r" . mu4e-extras-reindex-db)
:map mu4e-headers-mode-map
("$" . mu4e-extras-copy-sum)
("D" . mu4e-extras-headers-trash)
("E" . mu4e-extras-headers-mark-read-and-refile)
("X" . mu4e-extras-mark-execute-all-no-confirm)
("e" . mu4e-extras-headers-refile)
("o" . mu4e-extras-view-org-capture)
("r" . mu4e-extras-compose-reply)
("v" . mu4e-extras-headers-move)
("x" . mu4e-extras-open-gmail)
:map mu4e-view-mode-map
("$" . mu4e-extras-copy-sum)
(";" . mu4e-copy-message-path)
("D" . mu4e-extras-view-trash)
("e" . mu4e-extras-view-refile)
("o" . mu4e-extras-view-org-capture)
("r" . mu4e-extras-compose-reply)
("v" . mu4e-extras-view-move)
("x" . mu4e-extras-view-in-gmail)))
```
`org-msg` {#org-msg}
_[org-msg](https://github.com/jeremy-compostella/org-msg) is a global minor mode mixing up Org mode and Message mode to compose and reply to emails in an HTML-friendly style._
I use this package to compose messages and to reply to messages composed in HTML. For plain-text messages, I use `message` (see above).
```emacs-lisp
(use-package org-msg
:after mu4e-extras
:demand t
:custom
(org-msg-options "html-postamble:nil H:5 num:nil ^:{} toc:nil author:nil email:nil \\n:t tex:imagemagick")
(org-msg-startup "hidestars indent inlineimages")
(org-msg-recipient-names `((,(getenv "PERSONAL_GMAIL") . "Pablo")))
(org-msg-greeting-name-limit 3)
(org-msg-default-alternatives '((new . (text html))
(reply-to-html . (text html))))
(org-msg-convert-citation t)
(org-msg-enforce-css '((del nil
((font-family . "\"Georgia\"")
(font-size . "11pt")
(color . "grey")
(border-left . "none")
(text-decoration . "line-through")
(margin-bottom . "0px")
(margin-top . "10px")
(line-height . "1.3")))
(a nil
((color . "#4078F2")))
(a reply-header
((color . "black")
(text-decoration . "none")))
(div reply-header
((padding . "3.0pt 0in 0in 0in")
(border-top . "solid #d5d5d5 1.0pt")
(margin-bottom . "20px")))
(span underline
((text-decoration . "underline")))
(li nil
((font-family . "\"Georgia\"")
(font-size . "11pt")
(line-height . "1.3")
(margin-bottom . "0px")
(margin-top . "2px")))
(nil org-ul
((list-style-type . "square")))
(nil org-ol
((font-family . "\"Georgia\"")
(font-size . "11pt")
(line-height . "1.3")
(margin-bottom . "0px")
(margin-top . "0px")
(margin-left . "30px")
(padding-top . "0px")
(padding-left . "5px")))
(nil signature
((font-family . "\"Arial\", \"Helvetica\", sans-serif")
(font-size . "11pt")
(margin-bottom . "20px")))
(blockquote quote0
((padding-left . "5px")
(font-size . "0.9")
(margin-left . "10px")
(margin-top . "10px")
(margin-bottom . "0")
(background . "#f9f9f9")
(border-left . "3px solid #d5d5d5")))
(blockquote quote1
((padding-left . "5px")
(font-size . "0.9")
(margin-left . "10px")
(margin-top . "10px")
(margin-bottom . "0")
(background . "#f9f9f9")
(color . "#324e72")
(border-left . "3px solid #3c5d88")))
(pre nil
((line-height . "1.3")
(color . "#000000")
(background-color . "#f0f0f0")
(margin . "0px")
(font-size . "9pt")
(font-family . "monospace")))
(p nil
((text-decoration . "none")
(margin-bottom . "0px")
(margin-top . "10px")
(line-height . "1.3")
(font-size . "11pt")
(font-family . "\"Georgia\"")))
(div nil
((font-family . "\"Georgia\"")
(font-size . "11pt")
(line-height . "11pt")))))
:config
(org-msg-mode)
(require 'pass)
:hook
(org-msg-mode-hook . (lambda () (auto-fill-mode -1)))
:bind
(:map org-msg-edit-mode-map
("s-A-b" . message-goto-bcc)
("s-A-c" . message-goto-cc)
("s-A-s" . message-send)
("s-a" . org-msg-attach)
("s-b" . org-msg-extras-begin-compose)
("s-c" . message-send-and-exit)
("s-f" . message-goto-from)
("s-k" . org-insert-link)
("s-s" . message-goto-subject)
("s-t" . message-goto-to)
("s-A-l" . org-extras-url-dwim)))
```
`org-msg-extras` {#org-msg-extras}
_[org-msg-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-msg-extras.el) collects my extensions for `org-msg`._
```emacs-lisp
(use-personal-package org-msg-extras
:after org-msg
:demand t
:hook
(org-msg-edit-mode-hook . org-msg-extras-fold-signature-blocks)
:bind
(:map org-msg-edit-mode-map
("s-g" . org-msg-extras-open-in-grammarly)
("s-x" . org-msg-extras-kill-message)))
```
messaging {#messaging}
`telega` {#telega}
_[telega](https://github.com/zevlg/telega.el) is an unofficial Emacs Telegram client._
To upgrade TDLib with homebrew, run `brew upgrade tdlib --fetch-HEAD` in a terminal, then `M-x telega-server-build`.
If you need to install an earlier version than `HEAD`, you’ll need to build from source:
1. Clone the repo: `git clone https://github.com/tdlib/td.git`
2. Navigate to the `td` directory: `cd td`
3. Check out the desired version: `git checkout <version_tag>`
4. Create a build directory: `mkdir build && cd build && cmake ../`
5. Build the library: `make -jN`, replacing `N` with the number of cores to use for the compilation.
6. Install the library: `make install`
Then change the value of `telega-server-libs-prefix` from `/opt/homebrew` to `/usr/local/lib`.
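The steps above can be consolidated into a single script (a sketch; `v1.8.0` is a placeholder version tag, and the core count should be adjusted to your machine):

```shell
#!/bin/sh
# Build and install a specific TDLib version from source (sketch).
# Replace v1.8.0 with the tag you need and -j4 with your core count.
set -e
git clone https://github.com/tdlib/td.git
cd td
git checkout v1.8.0
mkdir build && cd build && cmake ../
make -j4
make install
```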
```emacs-lisp
(use-package telega
:init
(setopt telega-server-libs-prefix "/opt/homebrew")
:custom
(telega-chat-input-markups '("org"))
(telega-use-images t)
(telega-emoji-font-family 'noto-emoji)
(telega-emoji-use-images nil)
(telega-filters-custom '(("Main" . main)
("Important" or mention
(and unread unmuted))
("Archive" . archive)
("Online" and
(not saved-messages) (user is-online))
("Groups" type basicgroup supergroup)
("Channels" type channel)))
(telega-completing-read-function 'completing-read)
(telega-webpage-history-max most-positive-fixnum)
(telega-root-fill-column 110)
(telega-chat-fill-column 90)
(telega-webpage-fill-column 110)
(telega-photo-size-limits `(8 3 ,(* 55 1.5) ,(* 12 1.5)))
(telega-webpage-photo-size-limits `(55 10 ,(* 110 1.5) ,(* 20 1.5)))
(telega-mode-line-format nil)
(telega-vvnote-play-speeds '(1 1.5 2))
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'telega-search-history))
(telega-mode-line-mode)
(faces-extras-set-and-store-face-attributes
'((telega-entity-type-code :family faces-extras-fixed-pitch-font :height
(face-attribute 'default :height))))
:hook
(telega-chat-mode-hook . (lambda () (setq default-directory paths-dir-downloads)))
:bind
(:map telega-chat-mode-map
("M-p" . nil)
("S-<return>" . newline)
("A-s-r" . telega-chatbuf-next-unread-reaction)
("A-C-s-f" . telega-msg-next)
("A-C-s-r" . telega-msg-previous)
("<return>" . telega-extras-smart-enter)
;; if point is on a URL, `telega-msg-button-map' ceases to be
;; active and `<return>' triggers `newline' rather than
;; `push-button'. this seems to be a bug. as a workaround, we also
;; bind `push-button' to `s-<return>' in `telega-chat-mode-map'.
("s-," . telega-chatbuf-goto-pinned-message)
("s-a" . telega-chatbuf-attach)
("s-c" . telega-mnz-chatbuf-attach-code)
("s-d" . telega-chatbuf-goto-date)
("s-f" . telega-chatbuf-filter)
("s-k" . org-insert-link)
("s-m" . telega-chatbuf-attach-media)
("s-o" . telega-chatbuf-filter-by-topic)
("s-r" . telega-msg-add-reaction)
("s-s" . telega-chatbuf-filter-search)
("s-t" . telega-sticker-choose-favorite-or-recent)
("s-v" . org-extras-paste-with-conversion)
("M-s-v" . telega-chatbuf-attach-clipboard)
("s-z" . telega-mnz-chatbuf-attach-code)
("M-s-e" . telega-chatbuf-edit-prev)
("M-s-v" . telega-chatbuf-attach-clipboard)
("M-f" . ace-link-org)
:map telega-msg-button-map
("k" . telega-button-backward)
("l" . telega-button-forward)
("<return>" . telega-extras-smart-enter)
("," . telega-chatbuf-goto-pinned-message)
("C" . telega-msg-copy-link)
("D" . telega-msg-delete-dwim)
("F" . telega-msg-forward-dwim)
("f" . ace-link-org)
("s" . telega-chatbuf-filter-search)
("w" . telega-browse-url)
("W" . telega-chatbuf-filter-cancel)
:map telega-chat-button-map
("a" . nil)
("o" . nil)
:map telega-root-mode-map
("k" . telega-button-backward)
("<up>" . telega-button-backward)
("l" . telega-button-forward)
("<down>" . telega-button-forward)
("SPC" . telega-root-next-unread)
("." . telega-chat-with)
("a" . telega-chat-toggle-archive)
("f" . avy-extras-telega-view-message)
("m" . telega-chat-toggle-muted)
:map telega-webpage-mode-map
("x" . telega-webpage-browse-url)))
```
`telega-mnz` {#telega-mnz}
_[telega-mnz](https://github.com/zevlg/telega.el/blob/master/contrib/telega-mnz.el) displays syntax highlighting in Telega code blocks._
```emacs-lisp
(use-feature telega-mnz
:after telega
:demand t
:custom
(telega-mnz-use-language-detection nil)
:hook
(telega-load-hook . global-telega-mnz-mode))
```
`telega-dired-dwim` {#telega-dired-dwim}
_[telega-dired-dwim](https://github.com/zevlg/telega.el/blob/master/contrib/telega-dired-dwim.el) enables Dired file attachments in Telega chat buffers._
```emacs-lisp
(use-feature telega-dired-dwim
:after telega dired)
```
`telega-extras` {#telega-extras}
_[telega-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/telega-extras.el) collects my extensions for `telega`._
```emacs-lisp
(use-personal-package telega-extras
:hook
(telega-load-hook . telega-extras-reset-tab-bar)
(telega-chat-post-message-hook . telega-extras-transcribe-audio)
:bind
(("A-l" . telega-extras-switch-to)
:map telega-msg-button-map
("o" . telega-extras-chat-org-capture)
("." . telega-extras-docs-change-open)
("b" . telega-extras-transcribe-audio)
("d" . telega-extras-download-file)
("L" . telega-extras-chat-org-capture-leo)
:map telega-chat-mode-map
("M-s-t" . telega-extras-chatbuf-attach-most-recent-file)
:map dired-mode-map
("M-s-a" . telega-extras-dired-attach-send)
:map telega-root-view-map
("a" . telega-extras-view-archive)
("m" . telega-extras-view-main)
:map telega-root-mode-map
("o" . telega-extras-chat-org-capture)))
```
`ol-telega` {#ol-telega}
_[ol-telega](https://github.com/zevlg/telega.el/blob/master/contrib/ol-telega.el) enables Org mode links to Telega chats and messages._
```emacs-lisp
(use-feature ol-telega
:after telega)
```
sgn {#sgn}
_[sgn](https://github.com/benthamite/sgn) is an Emacs interface for Signal._
```emacs-lisp
(use-package sgn
:ensure (:host github
:repo "benthamite/sgn")
:custom
(sgn-account "+14246668293"))
```
`wasabi` {#wasabi}
_[wasabi](https://github.com/xenodium/wasabi/) is a WhatsApp Emacs client powered by wuzapi and whatsmeow._
```emacs-lisp
(use-package wasabi
:ensure (:host github
:repo "xenodium/wasabi")
:defer t)
```
`ement` {#ement}
_[ement](https://github.com/alphapapa/ement.el) is a Matrix client for Emacs._
```emacs-lisp
(use-package ement
:disabled)
```
I installed `ement` in the hopes that I would be able to send Signal and WhatsApp messages from Emacs. I tried the [two way verification](https://github.com/alphapapa/ement.el/blob/6412c8aaae29ee79ccfb44582001c12d147cd5a6/e2ee.org#two-way-verification) method but calling `panctl` results in a `dbus`-related error, and I was unable to make `dbus` work. Some discussion [here](https://www.reddit.com/r/emacs/comments/1crerbh/comment/l40aszc/) (see also [this comment](https://www.reddit.com/r/emacs/comments/1crerbh/comment/l40prh2/)).
`erc` {#erc}
_[erc](https://www.gnu.org/software/emacs/manual/html_mono/erc.html) is an IRC client for Emacs._
```emacs-lisp
(use-feature erc
:after auth-source-pass
:defer t
:custom
(erc-server "irc.libera.chat")
(erc-user-full-name user-full-name)
(erc-nick (auth-source-pass-get "user" "auth-sources/erc/libera"))
(erc-password (auth-source-pass-get 'secret "auth-sources/erc/libera"))
(erc-prompt-for-nickserv-password nil)
;; erc-track-shorten-start 8 ; characters to display in modeline
(erc-autojoin-channels-alist '(("irc.libera.chat")))
(erc-kill-buffer-on-part nil)
(erc-auto-query t)
:config
(add-to-list 'erc-modules 'notifications)
(add-to-list 'erc-modules 'spelling))
```
`circe` {#circe}
_[circe](https://github.com/emacs-circe/circe) is another IRC client for Emacs._
```emacs-lisp
(use-package circe
:defer t)
```
`slack` {#slack}
_[slack](https://github.com/yuya373/emacs-slack) is a Slack client for Emacs._
```emacs-lisp
(use-package slack
:ensure (:host github
:repo "benthamite/emacs-slack")
:custom
(slack-file-dir paths-dir-downloads)
(slack-prefer-current-team t)
(slack-buffer-emojify t)
(slack-message-custom-notifier 'ignore)
:config
(require 'pass)
(slack-register-team
:name "Epoch AI"
:token (auth-source-pass-get "token" "epoch/slack.com/epochai")
:cookie (auth-source-pass-get "cookie" "epoch/slack.com/epochai")
:animate-image t)
:hook
(slack-buffer-mode-hook . (lambda () (setopt line-spacing nil)))
:bind
(("A-k" . slack-menu)
:map slack-mode-map
("s-a" . slack-all-threads)
("s-c" . slack-channel-select)
("s-g" . slack-group-select)
("s-m" . slack-im-select)
("H-s-t" . slack-change-current-team)
("s-u" . slack-select-rooms)
("H-s-u" . slack-select-unread-rooms)
:map slack-buffer-mode-map
("s-a" . slack-all-threads)
("s-c" . slack-channel-select)
("s-g" . slack-group-select)
("s-m" . slack-im-select)
("H-s-t" . slack-change-current-team)
("s-u" . slack-select-rooms)
("H-s-u" . slack-select-unread-rooms) ; `slack-all-unreads' not working
:map slack-thread-message-buffer-mode-map
("d" . slack-thread-show-or-create)
("e" . slack-message-edit)
("o" . slack-chat-org-capture)
("q" . files-extras-kill-other-buffer)
("r" . slack-thread-reply)
("R" . slack-message-add-reaction)
("z" . slack-message-write-another-buffer)
("s-z" . slack-message-write-another-buffer)
:map slack-activity-feed-buffer-mode-map
("," . slack-activity-feed-goto-prev)
("." . slack-activity-feed-goto-next)
("d" . slack-thread-show-or-create)
("e" . slack-message-edit)
("k" . slack-feed-goto-prev)
("l" . slack-feed-goto-next)
("r" . slack-thread-reply)
("R" . slack-message-add-reaction)
("z" . slack-message-write-another-buffer)
("s-z" . slack-message-write-another-buffer)
:map slack-message-buffer-mode-map
("," . slack-buffer-goto-prev-message)
("." . slack-buffer-goto-next-message)
("d" . slack-thread-show-or-create)
("e" . slack-message-edit)
("k" . slack-buffer-goto-prev-message)
("l" . slack-buffer-goto-next-message)
("o" . slack-chat-org-capture)
("r" . slack-thread-reply)
("R" . slack-message-add-reaction)
("z" . slack-message-write-another-buffer)
("s-z" . slack-message-write-another-buffer)
:map slack-message-compose-buffer-mode-map
("s-c" . slack-message-send-from-buffer)
("s-f" . slack-message-select-file)
("s-m" . slack-message-embed-mention)))
```
`slack-extras` {#slack-extras}
_[slack-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/slack-extras.el) collects my extensions for `slack`._
```emacs-lisp
(use-personal-package slack-extras
:after slack
:demand t
:bind
(:map slack-activity-feed-buffer-mode-map
("E" . slack-extras-work-capture)
("o" . slack-extras-personal-capture)
:map slack-thread-message-buffer-mode-map
("E" . slack-extras-work-capture)
("o" . slack-extras-personal-capture)
:map slack-message-buffer-mode-map
("E" . slack-extras-work-capture)
("o" . slack-extras-personal-capture)))
```
`ol-emacs-slack` {#ol-emacs-slack}
_[ol-emacs-slack](https://github.com/ag91/ol-emacs-slack) provides `org-store-link` support for `slack`._
```emacs-lisp
(use-package ol-emacs-slack
:ensure (:host github
:repo "benthamite/ol-emacs-slack")
:after slack)
```
web {#web}
- [Emacs-focused Web Browsing](http://www.howardism.org/Technical/Emacs/browsing-in-emacs.html)
- [EWW and my extras (text-based Emacs web browser) | Protesilaos Stavrou](https://protesilaos.com/codelog/2021-03-25-emacs-eww/)
`browse-url` {#browse-url}
_browse-url provides functions for browsing URLs._
```emacs-lisp
(use-feature browse-url
:defer 30
:custom
(browse-url-browser-function 'eww-browse-url)
(browse-url-firefox-program "/Applications/Firefox.app/Contents/MacOS/firefox")
(browse-url-chrome-program "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome"))
```
`browse-url-extras` {#browse-url-extras}
_[browse-url-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/browse-url-extras.el) collects my extensions for `browse-url`._
```emacs-lisp
(use-personal-package browse-url-extras
:init
(with-eval-after-load 'xwidget
(bind-keys :map xwidget-webkit-mode-map
("X" . browse-extras-browse-url-externally)))
:after browse-url)
```
`shr` {#shr}
_shr is a simple HTML renderer._
```emacs-lisp
(use-feature shr
:after faces-extras
:defer t
:custom
(shr-bullet "• ")
(shr-use-colors nil)
(shr-use-fonts t)
(shr-image-animate nil)
(shr-width nil)
(shr-max-width 100)
(shr-discard-aria-hidden t)
(shr-cookie-policy t)
:config
(faces-extras-set-and-store-face-attributes
'((shr-text :height 0.65)
;; doesn’t seem to be working?
(shr-h1 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height)
(shr-h2 :family faces-extras-fixed-pitch-font :height faces-extras-org-level-height))))
```
`html` {#html}
_[html](https://github.com/emacs-mirror/emacs/blob/master/lisp/textmodes/sgml-mode.el) provides a major mode for editing HTML files._
```emacs-lisp
(use-feature html
:bind (:map html-mode-map
("s-w" . eww-extras-browse-file)))
```
`mhtml` {#mhtml}
_mhtml is an editing mode that handles CSS and JavaScript._
```emacs-lisp
(use-feature mhtml
:bind
(:map mhtml-mode-map
("s-x" . browse-url-of-buffer)
("s-w" . eww-extras-browse-file)))
```
`shr-tag-pre-highlight` {#shr-tag-pre-highlight}
_[shr-tag-pre-highlight](https://github.com/xuchunyang/shr-tag-pre-highlight.el) adds syntax highlighting for code blocks in HTML rendered by `shr`._
```emacs-lisp
(use-package shr-tag-pre-highlight
:after (:any eww elfeed)
:config
(add-to-list 'shr-external-rendering-functions
'(pre . shr-tag-pre-highlight)))
```
`shr-heading` {#shr-heading}
_[shr-heading](https://github.com/oantolin/emacs-config/blob/master/my-lisp/shr-heading.el) supports heading navigation for shr-rendered buffers._
Discussion [here](https://www.reddit.com/r/emacs/comments/u234pn/comment/i4i3gqg/?utm_source=share&utm_medium=web2x&context=3).
```emacs-lisp
(use-package shr-heading
:ensure (:host github
:repo "oantolin/emacs-config"
:files ("my-lisp/shr-heading.el"))
:after shr
:hook
(eww-mode-hook . shr-heading-setup-imenu))
```
`eww` {#eww}
_eww is a text-based web browser._
```emacs-lisp
(use-feature eww
:after simple-extras
:custom
(eww-search-prefix "https://duckduckgo.com/?t=h_&q=")
(eww-restore-desktop t)
(eww-desktop-remove-duplicates t)
(eww-header-line-format nil)
(eww-download-directory paths-dir-downloads)
(eww-auto-rename-buffer 'title)
(eww-suggest-uris
'(eww-links-at-point
thing-at-point-url-at-point))
(eww-history-limit most-positive-fixnum)
(eww-browse-url-new-window-is-tab nil)
;; make eww respect url handlers when following links in webpages
:config
(dolist (cons browse-url-handlers)
(setopt eww-use-browse-url
(concat eww-use-browse-url "\\|" (car cons))))
(with-eval-after-load 'savehist
(dolist (var '(eww-history eww-prompt-history))
(add-to-list 'savehist-additional-variables var)))
:bind
(("A-w" . eww)
:map eww-mode-map
("<return>" . eww-follow-link)
("S-<return>" . eww-follow-link)
("," . shr-heading-previous)
("." . shr-heading-next)
("[" . eww-previous-url)
("]" . eww-next-url)
("j" . eww-back-url)
(";" . eww-forward-url)
("e" . browse-url-extras-add-domain-to-open-externally)
("f" . ace-link-eww)
("F" . ace-link-extras-eww-new-buffer)
("s-f" . ace-link-extras-eww-externally)
("g" . nil)
("o" . eww-toggle-fonts)
("r" . eww-reload)
;; ":" (lambda! (eww-follow-link '(4)))
("X" . eww-browse-with-external-browser)
("s-c" . org-extras-eww-copy-for-org-mode)))
```
`eww-extras` {#eww-extras}
_[eww-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/eww-extras.el) collects my extensions for `eww`._
```emacs-lisp
(use-personal-package eww-extras
:after eww
:demand t
:config
(require 'xwidget)
(advice-add 'eww :before #'eww-extras-browse-youtube)
:bind
(:map eww-mode-map
("g e" . eww-extras-edit-current-url)
("g u" . eww-extras-go-up-url-hierarchy)
("g U" . eww-extras-go-to-root-url-hierarchy)
;; "p" 'eww-extras-open-with-recent-kill-ring
("h" . eww-extras-url-to-html)
("p" . eww-extras-url-to-pdf)
("x" . eww-extras-open-with-xwidget)
("s-d" . eww-extras-url-to-pdf)
("s-h" . eww-extras-url-to-html)
:map xwidget-webkit-mode-map
("x" . eww-extras-open-with-eww)))
```
`prot-eww` {#prot-eww}
_[prot-eww](https://github.com/protesilaos/dotfiles/blob/master/emacs/.emacs.d/prot-lisp/prot-eww.el) is a set of `eww` extensions from Protesilaos Stavrou's personal configuration._
Note Prot's clarification:
> Remember that every piece of Elisp that I write is for my own educational and recreational purposes. I am not a programmer and I do not recommend that you copy any of this if you are not certain of what it does.
```emacs-lisp
(use-package prot-eww
:ensure (:host github
:repo "protesilaos/dotfiles"
:local-repo "prot-eww"
:main "emacs/.emacs.d/prot-lisp/prot-eww.el"
:build (:not elpaca-check-version)
:files ("emacs/.emacs.d/prot-lisp/prot-eww.el"))
:after eww prot-common
:bind
(:map eww-mode-map
("M-f" . prot-eww-visit-url-on-page)
("A-M-f" . prot-eww-jump-to-url-on-page)))
```
`w3m` {#w3m}
_[w3m](https://github.com/emacs-w3m/emacs-w3m) is an Emacs interface to w3m._
I only use `w3m` to browse HTML email messages with `mu4e`. For web browsing, I use `eww`.
```emacs-lisp
(use-package w3m
:after simple mu4e
:bind
(:map w3m-minor-mode-map
("<left>" . left-char)
("<right>" . right-char)
("<up>" . previous-line)
("<down>" . next-line)
:map w3m-mode-map
("s-<return>" . w3m-view-url-with-browse-url)
:map mu4e-view-mode-map
("<return>" . w3m-view-url-with-browse-url)))
```
`elfeed` {#elfeed}
_[elfeed](https://github.com/skeeto/elfeed) is a web feeds client._
If the lines are breaking at the wrong places, set `shr-width` to the right value.
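For example, to hard-wrap rendered entries at 80 columns (80 is just an illustrative value; pick whatever suits your window width):

```emacs-lisp
;; Illustrative: make shr wrap at a fixed column instead of the window width.
(setopt shr-width 80)
```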
```emacs-lisp
(use-package elfeed
:ensure (:host github
:repo "benthamite/elfeed"
:branch "debounce-search-update") ; https://github.com/skeeto/elfeed/pull/558
:init
(run-with-idle-timer (* 5 60) t #'elfeed-extras-update)
:custom
(elfeed-curl-timeout 5)
(elfeed-curl-max-connections 4)
(elfeed-search-remain-on-entry t)
:config
(setq-default elfeed-search-filter "@15-days-ago +unread")
:hook
(elfeed-show-mode-hook . visual-line-mode)
(elfeed-search-mode-hook . (lambda ()
"Disable undo in ‘*elfeed-search*’ buffer to avoid warnings."
(buffer-disable-undo)))
:bind
(("A-f" . elfeed)
:map eww-mode-map
("c" . elfeed-kill-link-url-at-point)
:map elfeed-search-mode-map
("U" . elfeed-search-tag-all-unread)
("d" . elfeed-update)
("f" . avy-extras-elfeed-search-show-entry)
("j" . elfeed-unjam)
("o" . elfeed-org)
("s" . elfeed-search-live-filter)
:map elfeed-show-mode-map
(";" . elfeed-show-next)
("<return>" . eww-follow-link)
("," . shr-heading-previous)
("." . shr-heading-next)
("F" . ace-link-extras-eww-new-buffer)
("S-<return>" . eww-follow-link)
("a" . zotra-extras-add-entry)
("b" . nil)
("f" . ace-link-eww)
("j" . elfeed-show-prev)
("q" . files-extras-kill-this-buffer)
("s-f" . ace-link-extras-eww-externally)
("x" . elfeed-show-visit)))
```
`elfeed-extras` {#elfeed-extras}
_[elfeed-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/elfeed-extras.el) collects my extensions for `elfeed`._
```emacs-lisp
(use-personal-package elfeed-extras
:after elfeed
:demand t
:custom
(elfeed-show-entry-switch #'elfeed-extras-display-buffer)
:hook
(elfeed-search-mode-hook . elfeed-extras-disable-undo)
:bind
(:map elfeed-search-mode-map
("A" . elfeed-extras-mark-all-as-read)
("e" . elfeed-extras-toggle-read-entries)
("k" . elfeed-extras-follow-previous)
("l" . elfeed-extras-follow-next)
("w" . elfeed-extras-toggle-wiki-entries)
:map elfeed-show-mode-map
("<tab>" . elfeed-extras-jump-to-next-link)
("i" . elfeed-extras-toggle-fixed-pitch)
("w" . elfeed-extras-kill-link-url-of-entry)))
```
`elfeed-org` {#elfeed-org}
_[elfeed-org](https://github.com/remyhonig/elfeed-org) supports defining the feeds used by elfeed in an org-mode file._
```emacs-lisp
(use-package elfeed-org
:after elfeed
:custom
(rmh-elfeed-org-files (list paths-file-feeds-pablo))
:config
(elfeed-org))
```
`elfeed-tube` {#elfeed-tube}
_[elfeed-tube](https://github.com/karthink/elfeed-tube) integrates `elfeed` with YouTube._
```emacs-lisp
(use-package elfeed-tube
:after elfeed
:demand t
:custom
(elfeed-tube-auto-save-p t)
:config
(push '(text . shr-text) elfeed-tube-captions-faces)
(elfeed-tube-setup)
:bind
(:map elfeed-show-mode-map
("v" . elfeed-tube-mpv)
("F" . elfeed-tube-mpv-follow-mode)
("." . elfeed-tube-mpv-where)))
```
`elfeed-tube-mpv` {#elfeed-tube-mpv}
_[elfeed-tube-mpv](https://github.com/karthink/elfeed-tube) integrates `elfeed-tube` with `mpv`._
```emacs-lisp
(use-package elfeed-tube-mpv
:after elfeed-tube
:demand t
:custom
(elfeed-tube-save-indicator t))
```
`elfeed-ai` {#elfeed-ai}
_[elfeed-ai](https://github.com/benthamite/elfeed-ai/) provides AI-powered content curation for `elfeed`._
```emacs-lisp
(use-package elfeed-ai
:ensure (:host github
:repo "benthamite/elfeed-ai")
:after elfeed
:demand t
:custom
(elfeed-ai-model 'gemini-flash-lite-latest)
(elfeed-ai-interest-profile (file-name-concat paths-dir-notes "my-reading-taste-profile-summary.org"))
:config
(elfeed-ai-mode)
:bind
(:map elfeed-search-mode-map
("a" . elfeed-ai-menu)
("t" . elfeed-ai-toggle-sort)))
```
`engine-mode` {#engine-mode}
_[engine-mode](https://github.com/hrs/engine-mode) is a minor mode for defining and querying search engines through Emacs._
```emacs-lisp
(use-package engine-mode
:defer t
:custom
(engine/browser-function browse-url-browser-function)
:config
(engine/set-keymap-prefix (kbd "H-g"))
(defengine AllMusic
"http://www.allmusic.com/search/all/%s"
:keybinding "a m")
(defengine Alignment-Forum
"https://www.alignmentforum.org/search?query=%s"
:keybinding "a f")
(defengine AlternativeTo
"http://alternativeto.net/SearchResult.aspx?profile=all&search=%s"
:keybinding "a t")
(defengine Amazon-DE
"http://www.amazon.de/s?k=%s"
:keybinding "a d")
(defengine Amazon-ES
"http://www.amazon.es/s?k=%s"
:keybinding "a e")
(defengine Amazon-FR
"https://www.amazon.fr/s?k=%s"
:keybinding "a f")
(defengine TheresAnAIForThat
"https://theresanaiforthat.com/s/%s/"
:keybinding "a i")
(defengine Amazon-MX
"https://www.amazon.com.mx/s?k=%s"
:keybinding "a x")
(defengine Amazon-UK
"http://www.amazon.co.uk/s?k=%s"
:keybinding "a k")
(defengine Amazon-US
"http://www.amazon.com/s?k=%s"
:keybinding "a a")
(defengine AnkiWeb
"https://ankiweb.net/shared/decks/%s"
:keybinding "a w")
(defengine AstralCodexTen
"https://substack.com/search/%s?focusedPublicationId=89120"
:keybinding "a c"
;; individual Substack posts render nicely in eww, but for other pages we need a modern browser
:browser 'browse-url-default-browser)
(defengine Audible
"https://www.audible.com/search/ref=a_hp_tseft?advsearchKeywords=%s&filterby=field-keywords&x=13&y=11"
:keybinding "a u")
(defengine AudioBookBay
"https://audiobookbay.lu/?s=%s&tt=1"
:keybinding "a b")
(defengine EABlogs
"https://cse.google.com/cse?cx=013594344773078830993:k3igzr2se6y&q=%s"
:keybinding "b b")
(defengine BookFinder
"http://www.bookfinder.com/search/?keywords=%s&st=xl&ac=qr&src=opensearch"
:keybinding "b f")
(defengine Bing
"https://www.bing.com/search?q=%s&PC=U316&FORM=CHROMN"
:keybinding "b i")
(defengine UCBerkeleyLibrary "https://search.library.berkeley.edu/discovery/search?query=any,contains,%s&tab=Default_UCLibrarySearch&search_scope=DN_and_CI&vid=01UCS_BER:UCB&offset=0"
:keybinding "b l")
(defengine MercadoLibre
"https://listado.mercadolibre.com.ar/%s#D[A:qwer]"
:keybinding "c c")
(defengine CRSocietyForums
"https://www.crsociety.org/search/?q=%s"
:keybinding "c r")
(defengine Calendly
"https://calendly.com/app/login?email=%s&lang=en"
:keybinding "c l")
(defengine ChromeExtensions
"https://chrome.google.com/webstore/search/%s?_category=extensions"
:keybinding "c e")
(defengine Crossref
"https://search.crossref.org/?q=%s"
:keybinding "c r")
(defengine MercadoLibreUsed
"https://listado.mercadolibre.com.ar/%s_ITEM*CONDITION_2230581_NoIndex_True#applied_filter_id%3DITEM_CONDITION%26applied_filter_name%3DCondici%C3%B3n%26applied_filter_order%3D10%26applied_value_id%3D2230581%26applied_value_name%3DUsado%26applied_value_order%3D2%26applied_value_results%3D9%26is_custom%3Dfalse"
:keybinding "c r")
(defengine DOI
"https://doi.org/%s"
:keybinding "d o")
(defengine DuckDuckGo
"https://duckduckgo.com/?q=%s"
:keybinding "d d")
(defengine Diccionario-Panhispánico-de-Dudas
"https://www.rae.es/dpd/%s"
:keybinding "d p")
(defengine EAForum
"https://www.google.com/search?q=%s+site:forum.effectivealtruism.org"
:keybinding "f f")
(defengine Ebay-UK
"https://www.ebay.co.uk/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313&_nkw=%s&_sacat=0"
:keybinding "e k")
(defengine Ebay-US
"https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313&_nkw=%s&_sacat=0"
:keybinding "e b")
(defengine Ebay-DE
"https://www.ebay.de/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313&_nkw=%s&_sacat=0"
:keybinding "e d")
(defengine Fundeu
"https://cse.google.com/cse?cx=005053095451413799011:alg8dd3pluq&q=%s"
:keybinding "f f")
(defengine Flickr
"http://www.flickr.com/search/?q=%s"
:keybinding "f l")
(defengine Financial-Times
"https://www.ft.com/search?q=%s"
:keybinding "f t")
(defengine GitHub
"https://github.com/search?q=%s&type=code"
:keybinding "g h")
(defengine Goodreads
"http://www.goodreads.com/search/search?search_type=books&search[query]=%s"
:keybinding "g r")
(defengine Google
"https://www.google.com/search?q=%s"
:keybinding "g g")
(defengine Google-Books
"https://www.google.com/search?q=%s&btnG=Search+Books&tbm=bks&tbo=1&gws_rd=ssl"
:keybinding "g k")
(defengine Google-Custom-Search
"https://cse.google.com/cse?cx=013594344773078830993:bg9mrnfwe30&q=%s"
:keybinding "g c")
(defengine Google-Domains
"https://domains.google.com/registrar?s=%s&hl=en"
:keybinding "g d")
(defengine Google-Drive
"https://drive.google.com/drive/u/0/search?q=%s"
:keybinding "g d")
(defengine Google-Trends
"http://www.google.com/trends/explore#q=%s"
:keybinding "g e")
(defengine Google-Images
"https://www.google.com/search?tbm=isch&source=hp&biw=1920&bih=1006&ei=2PlgWp_OEcHF6QTo2b2ACQ&q=%s"
:keybinding "g i")
(defengine Google-Maps
"https://www.google.com/maps/search/%s"
:keybinding "g m")
(defengine Google-News
"https://news.google.com/search?q=%s"
:keybinding "g n")
(defengine Google-Podcasts
"https://podcasts.google.com/?q=%s"
:keybinding "g o")
(defengine Google-Photos
"https://photos.google.com/search/%s"
:keybinding "g p")
(defengine Google-Scholar
"https://scholar.google.com/scholar?hl=en&as_sdt=1%2C5&q=%s&btnG=&lr="
:keybinding "s s")
(defengine Google-Translate
"https://translate.google.com/#auto/en/%s"
:keybinding "g t")
(defengine Google-Video
"https://www.google.com/search?q=%s&tbm=vid"
:keybinding "g v")
(defengine GiveWell
"https://www.givewell.org/search/ss360/%s"
:keybinding "g w")
(defengine Google-Play
"https://play.google.com/store/search?q=%s"
:keybinding "g y")
(defengine Google-Scholar-Spanish
"https://scholar.google.com/scholar?hl=en&as_sdt=1%2C5&q=%s&btnG="
:keybinding "s x")
(defengine Gwern
"https://www.google.com/search?q=%s+site:gwern.net"
:keybinding "g w")
(defengine IMDb
"https://www.imdb.com/find/?q=%s"
:keybinding "i i")
(defengine IMDb-Actor
"http://www.imdb.com/filmosearch?explore=title_type&role=%s&ref_=filmo_ref_job_typ&sort=user_rating"
:keybinding "i a")
(defengine IMDb-Director
"http://www.imdb.com/filmosearch?explore=title_type&role=%s&ref_=filmo_ref_job_typ&sort=user_rating"
:keybinding "i d")
(defengine IMDb-Composer
"http://www.imdb.com/filmosearch?explore=title_type&role=%s&ref_=filmo_ref_job_typ&sort=user_rating"
:keybinding "i c")
(defengine IberLibroArgentina
"https://www.iberlibro.com/servlet/SearchResults?bi=0&bx=off&cm_sp=SearchF-_-Advs-_-Result&cty=ar&ds=20&kn=%s&prc=USD&recentlyadded=all&rgn=ww&rollup=on&sortby=20&xdesc=off&xpod=off"
:keybinding "i r")
(defengine Internet-Archive
"https://archive.org/search.php?query=%s"
:keybinding "v v")
(defengine Internet-Archive-Scholar
"https://scholar.archive.org/search?q=%s"
:keybinding "v s")
(defengine JustWatch
"https://www.justwatch.com/us/search?q=%s"
:keybinding "j w")
(defengine KAYAK
"https://www.kayak.co.uk/sherlock/opensearch/search?q=%s"
:keybinding "k k")
(defengine Keyboard-Maestro
"https://forum.keyboardmaestro.com/search?q=%s"
:keybinding "k m")
(defengine Lastfm
"http://www.last.fm/search?q=%s"
:keybinding "f m")
(defengine LessWrong
"https://www.google.com/search?q=%s+site:lesswrong.com"
:keybinding "l w")
(defengine LessWrongWiki
"https://wiki.lesswrong.com/index.php?title=Special:Search&search=%s"
:keybinding "l i")
(defengine LibraryGenesis
"http://libgen.li/index.php?req=%s"
:keybinding "l l")
(defengine Librivox
"https://librivox.org/search?q=%s&search_form=advanced"
:keybinding "l v")
(defengine LinkedIn
"http://www.linkedin.com/vsearch/f?type=all&keywords=%s&orig=GLHD&rsid=&pageKey=member-home&search=Search"
:keybinding "i n")
(defengine Linguee
"https://www.linguee.com/english-spanish/search?source=auto&query=%s"
:keybinding "l i")
(defengine Marginal-Revolution
"https://marginalrevolution.com/?s=%s"
:keybinding "m r")
(defengine MediaCenter
"https://www.google.com/search?q=%s+site:yabb.jriver.com"
:keybinding "m c")
(defengine Medium
"https://medium.com/search?q=%s&ref=opensearch"
:keybinding "m d")
(defengine Melpa
"https://melpa.org/#/?q=%s"
:keybinding "m p")
(defengine MetaFilter
"https://www.metafilter.com/contribute/search.mefi?site=mefi&q=%s"
:keybinding "m f")
(defengine Metaculus
"https://www.metaculus.com/questions/?order_by=-activity&search=%s"
:keybinding "m e")
(defengine Metaforecast
"https://metaforecast.org/?query=%s"
:keybinding "m m")
(defengine Movielens
"https://movielens.org/explore?q=%s"
:keybinding "m l")
(defengine Netflix
"https://www.netflix.com/search?q=%s"
:keybinding "n n")
(defengine New-York-Times
"https://www.nytimes.com/search?query=%s"
:keybinding "n y")
(defengine Notatu-Dignum
"http://www.stafforini.com/quotes/index.php?s=%s"
:keybinding "q q")
(defengine OddsChecker
"https://www.oddschecker.com/search?query=%s"
:keybinding "o c")
(defengine Open-Philanthropy
"https://www.google.com/search?q=%s+site:openphilanthropy.org"
:keybinding "o p")
(defengine Overcoming-Bias
"https://substack.com/search/%s?focusedPublicationId=1245641"
:keybinding "o b"
:browser 'browse-url-default-browser)
(defengine OxfordReference
"https://www-oxfordreference-com.myaccess.library.utoronto.ca/search?btog=chap&q0=%22%s%22"
:keybinding "o r")
(defengine OxfordReferenceDOI
"https://www-oxfordreference-com.myaccess.library.utoronto.ca/view/%s"
:keybinding "o d")
(defengine PhilPapers
"http://philpapers.org/s/%s"
:keybinding "p p")
(defengine AnnasArchive
(progn
(require 'annas-archive)
(concat annas-archive-home-url "search?index=&page=1&q=%s&ext=pdf&sort="))
:keybinding "r r")
(defengine ReducingSuffering
"http://reducing-suffering.org/?s=%s"
:keybinding "r s")
(defengine Reference
"https://cse.google.com/cse?cx=013594344773078830993:bg9mrnfwe30&q=%s"
:keybinding "r f")
(defengine sci-hub
"https://sci-hub.se/%s"
:keybinding "u u")
(defengine ScienceDirectencyclopedias
"https://www.sciencedirect.com/search?qs=%s&articleTypes=EN"
:keybinding "s e")
(defengine SlateStarCodex
"http://slatestarcodex.com/?s=%s"
:keybinding "s c")
(defengine StackSnippet
"http://www.stacksnippet.com/#gsc.tab=0&gsc.q=%s"
:keybinding "s n")
(defengine Stanford-Encyclopedia-of-Philosophy
"https://plato.stanford.edu/search/searcher.py?query=%s"
:keybinding "s p")
(defengine Tango-DJ
"http://www.tango-dj.at/database/?tango-db-search=%s&search=Search"
:keybinding "d j")
(defengine TangoDJ-Yahoo-Group
"http://groups.yahoo.com/group/TangoDJ/msearch?query=%s&submit=Search&charset=ISO-8859-1"
:keybinding "t y")
(defengine TasteDive
"https://tastedive.com/like/%s"
:keybinding "t d")
(defengine ThreadReader
"https://threadreaderapp.com/search?q=%s"
:keybinding "t r")
(defengine Twitter
"https://twitter.com/search?q=%s&src=typed_query"
:keybinding "t w")
(defengine Vimeo
"http://vimeo.com/search?q=%s"
:keybinding "v m")
(defengine WaybackMachine
"http://web.archive.org/web/*/%s"
:keybinding "w b")
(defengine Wikipedia-Deutsch
"https://de.wikipedia.org/w/index.php?title=Spezial:Suche&search=%s"
:keybinding "w d")
(defengine Wikipedia-English
"http://en.wikipedia.org/w/index.php?title=Special:Search&profile=default&search=%s&fulltext=Search"
:keybinding "w w")
(defengine Wikipedia-French
"http://fr.wikipedia.org/w/index.php?title=Spécial:Recherche&search=%s"
:keybinding "w f")
(defengine Wikipedia-Italiano
"http://it.wikipedia.org/w/index.php?title=Speciale:Ricerca&search=%s"
:keybinding "w i")
(defengine Wikipedia-Spanish
"https://es.wikipedia.org/w/index.php?search=%s&title=Especial:Buscar&ns0=1&ns11=1&ns100=1"
:keybinding "w e")
(defengine Wikipedia-Swedish
"http://sv.wikipedia.org/w/index.php?title=Special:S%C3%B6k&search=%s"
:keybinding "w s")
(defengine Wirecutter
"https://thewirecutter.com/search/?s=%s"
:keybinding "w t")
(defengine WorldCat
"http://www.worldcat.org/search?q=%s&qt=results_page"
:keybinding "w c")
(defengine YahooFinance
"https://finance.yahoo.com/company/%s"
:keybinding "y f")
(defengine YouTube
"https://www.youtube.com/results?search_query=%s"
:keybinding "y t")
(defengine YouTubemovies
"https://www.youtube.com/results?lclk=long&filters=hd%2Clong&search_query=%s"
:keybinding "y m")
:hook minibuffer-setup-hook)
```
`org-download` {#org-download}
_[org-download](https://github.com/abo-abo/org-download) supports drag and drop images to org-mode._
```emacs-lisp
(use-package org-download
:after org
:bind
("H-s-v" . org-download-clipboard))
```
`org-web-tools` {#org-web-tools}
_[org-web-tools](https://github.com/alphapapa/org-web-tools) supports viewing, capturing, and archiving web pages in org-mode._
```emacs-lisp
(use-package org-web-tools
:defer t)
```
`org-web-tools-extras` {#org-web-tools-extras}
_[org-web-tools-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/org-web-tools-extras.el) collects my extensions for `org-web-tools`._
```emacs-lisp
(use-personal-package org-web-tools-extras
:after org-web-tools)
```
`request` {#request}
_[request](https://github.com/tkf/emacs-request) provides HTTP request for Emacs Lisp._
```emacs-lisp
(use-package request
:defer t)
```
`deferred` {#deferred}
_[deferred](https://github.com/kiwanami/emacs-deferred) provides simple asynchronous functions for emacs lisp._
```emacs-lisp
(use-package deferred
:defer t)
```
`graphql-mode` {#graphql-mode}
_[graphql-mode](https://github.com/davazp/graphql-mode) is a major mode for GraphQL._
```emacs-lisp
(use-package graphql-mode
:defer t)
```
`mullvad` {#mullvad}
_[mullvad](https://github.com/benthamite/mullvad) provides a few functions for interfacing with Mullvad, a VPN service._
```emacs-lisp
(use-package mullvad
:ensure (mullvad
:host github
:repo "benthamite/mullvad")
:demand t
:custom
(mullvad-durations '(1 5 10 30 60 120))
(mullvad-cities-and-servers
'(("London" . "gb-lon-ovpn-005")
("Madrid" . "es-mad-ovpn-202")
("Malmö" . "se-sto-wg-005")
("Frankfurt" . "de-fra-wg-005")
("New York" . "us-nyc-wg-504")
("San José" . "us-sjc-wg-101")
("São Paulo" . "br-sao-wg-202")))
(mullvad-websites-and-cities
'(("Betfair" . "London")
("Criterion Channel" . "New York")
("Gemini" . "New York")
("HathiTrust" . "San José")
("IMDb" . "New York")
("Library Genesis" . "Malmö")
("Pirate Bay" . "Malmö")
("UC Berkeley" . "San José")
("Wise" . "Madrid")))
:bind
("A-a" . mullvad))
```
multimedia {#multimedia}
`EMMS` {#emms}
_[EMMS](https://www.gnu.org/software/emms/) (Emacs MultiMedia System) is media player software for Emacs._
EMMS is not powerful enough for my use case (tango DJ with a collection of over 70,000 tracks). But I'm exploring whether I can use it for specific purposes, such as batch-tagging.
```emacs-lisp
(use-package emms
:defer t
:disabled t ; temporarily because server is down
:custom
(emms-player-list '(emms-player-mpv))
(emms-source-file-default-directory paths-dir-music-tango)
(emms-playlist-buffer-name "*Music*")
(emms-info-functions '(emms-info-libtag)) ; make sure libtag is the only thing delivering metadata
;; ~1 order of magnitude faster; requires GNU find: `brew install findutils'
(emms-source-file-directory-tree-function 'emms-source-file-directory-tree-find)
:config
(require 'emms-setup)
(require 'emms-player-simple)
(require 'emms-source-file)
(require 'emms-source-playlist)
(require 'emms-info-native)
;; emms-print-metadata binary must be present; see emacs.stackexchange.com/a/22431/32089
(require 'emms-info-libtag)
(require 'emms-mode-line)
(require 'emms-mode-line-icon)
(require 'emms-playing-time)
(emms-all)
(emms-default-players)
(add-to-list 'emms-info-functions 'emms-info-libtag)
(emms-mode-line-mode)
(emms-playing-time 1))
```
`empv` {#empv}
_[empv](https://github.com/isamert/empv.el) is a media player based on [mpv](https://mpv.io/)._
```emacs-lisp
(use-package empv
:ensure (:host github
:repo "isamert/empv.el")
:defer t
:custom
(empv-audio-dir paths-dir-music-tango)
(empv-invidious-instance "https://invidious.fdn.fr/api/v1")
:config
(add-to-list 'empv-mpv-args "--ytdl-format=best")) ; github.com/isamert/empv.el#viewing-youtube-videos
```
`tangodb` {#tangodb}
_[tangodb](https://github.com/benthamite/tangodb.el) is a package for browsing and editing a tango encyclopaedia._
```emacs-lisp
(use-package tangodb
:ensure (:host github
:repo "benthamite/tangodb.el")
:defer t)
```
`trx` {#trx}
_[trx](https://github.com/benthamite/trx) is an Emacs interface for the [Transmission](https://transmissionbt.com/) BitTorrent client._
```emacs-lisp
(use-package trx
:ensure (:host github
:repo "benthamite/trx")
:defer t)
```
`ytdl` {#ytdl}
_[ytdl](https://gitlab.com/tuedachu/ytdl) is an Emacs interface for [youtube-dl](https://youtube-dl.org/)._
Note that this package also works with [yt-dlp](https://github.com/yt-dlp/yt-dlp), a `youtube-dl` fork.
```emacs-lisp
(use-package ytdl
:custom
(ytdl-command "yt-dlp")
(ytdl-video-folder paths-dir-downloads)
(ytdl-music-folder paths-dir-downloads)
(ytdl-download-folder paths-dir-downloads)
(ytdl-video-extra-args '("--write-sub" "--write-auto-sub" "--sub-lang" "en,es,it,fr,pt"))
:bind
(("A-M-y" . ytdl-download)
:map ytdl--dl-list-mode-map
("RET" . ytdl--open-item-at-point)
("D" . ytdl--delete-item-at-point)))
```
`esi-dictate` {#esi-dictate}
_[esi-dictate](https://git.sr.ht/~lepisma/emacs-speech-input) is a set of packages for speech and voice inputs in Emacs._
**Setup**:
1. Install Python dependencies: `pip install 'deepgram-sdk>=3.0,<4.0' pyaudio`. Note: the `dg.py` script requires SDK v3.x; v5.x has a completely different API and will not work.
2. Download `dg.py` from the repo and place it in your PATH (e.g., `~/.local/bin/dg.py`), then make it executable: `chmod +x ~/.local/bin/dg.py`.
**Usage**: `M-x esi-dictate-start` to begin dictation, `C-g` to stop.
```emacs-lisp
(use-package esi-dictate
  :ensure (:host sourcehut
           :repo "lepisma/emacs-speech-input")
  :after llm
  :defer t
  :custom
  (esi-dictate-llm-provider (make-llm-openai
                             :key (auth-source-pass-get "gptel" (concat "tlon/core/openai.com/" tlon-email-shared))
                             :chat-model "gpt-4o-mini"))
  (esi-dictate-dg-api-key (auth-source-pass-get "key" (concat "chrome/deepgram.com/" (getenv "PERSONAL_EMAIL"))))
  :bind
  (("A-h" . esi-dictate-start)
   :map esi-dictate-mode-map
   ("C-g" . esi-dictate-stop))
  :hook
  (esi-dictate-speech-final . esi-dictate-fix-context))
```
`read-aloud` {#read-aloud}
_[read-aloud](https://github.com/gromnitsky/read-aloud.el) is an Emacs interface to TTS (text-to-speech) engines._
- To give Emacs access to the microphone on macOS, clone `https://github.com/DocSystem/tccutil` and, from the cloned repo, run `sudo python3 tccutil.py -p /opt/homebrew/Cellar/emacs-plus@30/30.0.60/Emacs.app/ --microphone -e` (some discussion [here](https://scsynth.org/t/emacs-scsynth-and-microphone-permissions/3253)).
- To read with macOS directly, `b-n`. In turn, `b-h` starts dictation. (These are system-wide shortcuts defined with Karabiner rather than bindings specific to Emacs. See the "b-mode" section in my Karabiner configuration.)
<!--listend-->
```emacs-lisp
(use-package read-aloud
:custom
(read-aloud-engine "say")
:bind
("C-H-r" . read-aloud-this))
```
`read-aloud-extras` {#read-aloud-extras}
_[read-aloud-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/read-aloud-extras.el) collects my extensions for `read-aloud`._
```emacs-lisp
(use-personal-package read-aloud-extras
:after read-aloud)
```
`subed` {#subed}
_[subed](https://github.com/sachac/subed) is a subtitle editor for Emacs._
```emacs-lisp
(use-package subed
:ensure (:host github
:repo "sachac/subed"
:files ("subed/*.el"))
:defer t
:config
(defun subed-export-transcript ()
"Export a clean transcript of the current subtitle buffer to a file.
This function retrieves all subtitle text, strips any HTML-like tags (such
as WebVTT timing or style tags within the text lines), and then saves the
result to a user-specified file."
(interactive)
(let* ((subtitles (subed-subtitle-list))
(raw-text (if subtitles
(subed-subtitle-list-text subtitles nil) ; nil = do not include comments
""))
(cleaned-text (if (string-empty-p raw-text)
""
(let* ((lines (split-string raw-text "\n" t)) ; OMIT-NULLS is t
(stripped-lines (mapcar #'subed--strip-tags lines))
(unique-lines (seq-uniq stripped-lines)))
(mapconcat #'identity unique-lines "\n"))))
(buffer-filename (buffer-file-name))
(default-output-name (if buffer-filename
(concat (file-name-sans-extension buffer-filename) ".txt")
"transcript.txt"))
(output-file (read-file-name "Export clean transcript to: " nil default-output-name t)))
(if output-file
(progn
(with-temp-file output-file
(insert cleaned-text))
(message "Transcript exported to %s" output-file))
(message "Transcript export cancelled")))))
```
`spofy` {#spofy}
_[spofy](https://github.com/benthamite/spofy) is a Spotify player for Emacs._
```emacs-lisp
(use-package spofy
:ensure (:host github
:repo "benthamite/spofy")
:defer t
:custom
(spofy-client-id (auth-source-pass-get "spofy-id" "chrome/accounts.spotify.com"))
(spofy-client-secret (auth-source-pass-get "spofy-secret" "chrome/accounts.spotify.com"))
(spofy-tab-bar-max-length nil)
(spofy-tab-bar-alignment 'right)
(spofy-enable-tab-bar t)
(spofy-consult-columns '((track 50 40 40) (album 50 40 6) (artist 50 45) (playlist 50 35) (device 45)))
:bind (("A-y" . spofy-menu)))
```
misc {#misc}
`epoch` {#epoch}
```emacs-lisp
(use-package epoch
:ensure (:host github
:repo "benthamite/epoch.el")
:demand t
:bind
(("H-l" . epoch-menu)))
```
`calc` {#calc}
_calc is the Emacs calculator._
```emacs-lisp
(use-feature calc
:config
(with-eval-after-load 'savehist
(add-to-list 'savehist-additional-variables 'calc-quick-calc-history))
:bind
(("A-c" . calc)
("A-M-c" . quick-calc)
:map calc-mode-map
("C-k" . nil)))
```
`calc-ext` {#calc-ext}
_calc-ext provides various extension functions for calc._
```emacs-lisp
(use-feature calc-ext
:after calc
:bind (:map calc-alg-map
("C-k" . nil)))
```
`alert` {#alert}
_[alert](https://github.com/jwiegley/alert) is a Growl-like alerts notifier for Emacs._
```emacs-lisp
(use-package alert
:defer t
:custom
;; the settings below are not working; is it because `alert-default-style' is set to `notifier'?
(alert-fade-time 2)
(alert-persist-idle-time 60)
(alert-default-style 'osx-notifier)
:config
;; This function has to be loaded manually, for some reason.
(defun alert-osx-notifier-notify (info)
(apply #'call-process "osascript" nil nil nil "-e"
(list (format "display notification %S with title %S"
(alert-encode-string (plist-get info :message))
(alert-encode-string (plist-get info :title)))))
(alert-message-notify info)))
```
`midnight` {#midnight}
_midnight runs custom processes every night._
```emacs-lisp
(use-feature midnight
:defer 30
:custom
(clean-buffer-list-kill-never-buffer-names
'("*mu4e-headers*"
" *mu4e-update*"))
(clean-buffer-list-kill-never-regexps
'("\\` \\*Minibuf-.*\\*\\'"
"^untitled.*"))
(clean-buffer-list-delay-general 2) ; kill buffers unused for more than two days
:config
(midnight-mode)
;; setting the delay causes midnight to run immediately, so we set it via a timer
(run-with-idle-timer (* 3 60 60) nil (lambda () (midnight-delay-set 'midnight-delay "5:00am")))
(dolist (fun (nreverse '(files-extras-save-all-buffers
clean-buffer-list
ledger-mode-extras-update-coin-prices
ledger-mode-extras-update-commodities
magit-extras-stage-commit-and-push-all-repos
org-roam-db-sync
org-extras-id-update-id-locations
el-patch-validate-all
org-extras-agenda-switch-to-agenda-current-day
mu4e-extras-update-all-mail-and-index
org-gcal-sync
elfeed-update)))
(add-hook 'midnight-hook
(apply-partially #'simple-extras-call-verbosely
fun "Midnight hook now calling `%s'..."))))
```
`bbdb` {#bbdb}
_[bbdb](https://elpa.gnu.org/packages/bbdb.html) is a contact management package._
A tutorial for this undocumented package may be found [here](https://github.com/andycowl/bbdb3-manual/blob/master/tutorial.rst).
```emacs-lisp
(use-package bbdb
:ensure (:host github
:repo "benthamite/bbdb"
:files (:defaults "lisp/*.el")
:depth nil
:pre-build (("./autogen.sh")
("./configure")
("make"))
:build (:not elpaca-build-docs))
:custom
(bbdb-file (file-name-concat paths-dir-bbdb "bbdn.el"))
(bbdb-image-path (file-name-concat paths-dir-bbdb "media/"))
:config
(bbdb-initialize 'anniv)
:bind
(("A-b" . bbdb)
:map bbdb-mode-map
("A-C-s-r" . bbdb-prev-record)
("A-C-s-f" . bbdb-next-record)
("c" . bbdb-copy-fields-as-kill)
("C-k" . nil)
("M-d" . nil)))
```
`bbdb-extras` {#bbdb-extras}
_[bbdb-extras](https://github.com/benthamite/dotfiles/blob/main/emacs/extras/bbdb-extras.el) collects my extensions for `bbdb`._
```emacs-lisp
(use-personal-package bbdb-extras
:bind (:map bbdb-mode-map
("D" . bbdb-extras-delete-field-or-record-no-confirm)
("E" . bbdb-extras-export-vcard)
("n" . bbdb-extras-create-quick)))
```
`bbdb-vcard` {#bbdb-vcard}
_[bbdb-vcard](https://github.com/tohojo/bbdb-vcard) supports import and export for BBDB._
```emacs-lisp
(use-package bbdb-vcard
:after bbdb
:custom
(bbdb-vcard-directory paths-dir-bbdb)
(bbdb-vcard-media-directory "media"))
```
`macos` {#macos}
_[macos](https://github.com/benthamite/macos) is a simple package I developed that provides a few macOS-specific functions._
```emacs-lisp
(use-package macos
:ensure (:host github
:repo "benthamite/macos")
:custom
(macos-bluetooth-device-list
'(("Sonny WH-1000XM5" . "ac-80-0a-37-41-1e")))
:bind
(("C-M-s-c" . macos-bluetooth-device-dwim)
("C-M-s-g" . macos-set-dictation-language)))
```
`keycast` {#keycast}
_[keycast](https://github.com/tarsius/keycast) shows the current command and its key in the mode line._
```emacs-lisp
(use-package keycast
:defer t
:config
;; support for doom modeline (github.com/tarsius/keycast/issues/7)
(with-eval-after-load 'keycast
(define-minor-mode keycast-mode
"Show current command and its key binding in the mode line."
:global t
(if keycast-mode
(add-hook 'pre-command-hook 'keycast--update t)
(remove-hook 'pre-command-hook 'keycast--update)))
(add-to-list 'global-mode-string '("" keycast-mode-line))))
```
`activity-watch-mode` {#activity-watch-mode}
_[activity-watch-mode](https://github.com/wakatime/wakatime-mode) is an Emacs watcher for [ActivityWatch](https://activitywatch.net/)._
```emacs-lisp
(use-package activity-watch-mode
:defer 30
:config
(require 'magit)
(global-activity-watch-mode)
(advice-add 'activity-watch--save :around
(lambda (fn &rest args)
"Ignore errors from activity-watch (e.g. missing directories)."
(condition-case nil
(apply fn args)
(error nil)))))
```
`custom` {#custom}
_[custom](https://github.com/emacs-mirror/emacs/blob/master/lisp/custom.el) provides the Customize interface for user options and themes._
```emacs-lisp
(use-feature custom
:custom
(custom-safe-themes t)
(custom-file (make-temp-file "gone-baby-gone")) ; move unintended customizations to a garbage file
:bind
(:map custom-mode-map
("f" . ace-link-custom)))
```
`mercado-libre` {#mercado-libre}
_[mercado-libre](https://github.com/benthamite/mercado-libre) is a package for querying MercadoLibre, a popular Latin American e-commerce platform._
```emacs-lisp
(use-package mercado-libre
:ensure (:host github
:repo "benthamite/mercado-libre")
:defer t
:custom
(mercado-libre-client-id (auth-source-pass-get "app-id" "chrome/mercadolibre.com/benthamite"))
(mercado-libre-client-key (auth-source-pass-get "app-key" "chrome/mercadolibre.com/benthamite"))
(mercado-libre-new-results-limit nil)
(mercado-libre-listings-db-file
(file-name-concat paths-dir-dropbox "Apps/Mercado Libre/mercado-libre-listings.el")))
```
`polymarket` {#polymarket}
_[polymarket](https://github.com/benthamite/polymarket) is a package to fetch and place trades on Polymarket, a popular prediction market._
(This package is not currently public.)
```emacs-lisp
(use-package polymarket
:ensure (:host github
:repo "benthamite/polymarket")
:defer t)
```
`kelly` {#kelly}
_[kelly](https://github.com/benthamite/kelly) is a Kelly criterion calculator._
```emacs-lisp
(use-package kelly
:ensure (:host github
:repo "benthamite/kelly")
:defer t
:custom
(kelly-b-parameter-type 'probability))
```
`fatebook` {#fatebook}
_[fatebook](https://github.com/sonofhypnos/fatebook.el) is an Emacs package to create predictions on Fatebook._
```emacs-lisp
(use-package fatebook
:ensure (:repo "sonofhypnos/fatebook.el"
:host github
:files ("fatebook.el"))
:commands fatebook-create-question
:custom
(fatebook-api-key-function (lambda () (auth-source-pass-get "api" "chrome/fatebook.io"))))
```
`keyboard-maestro` {#keyboard-maestro}
_Keybindings for triggering Emacs commands from [Keyboard Maestro](https://www.keyboardmaestro.com/)._
These bindings allow Keyboard Maestro to trigger various Emacs processes.
Note to self: The pattern for KM shortcuts is `C-H-<capital-letter>`. This corresponds to `⇧⌘⌃<letter>` in macOS.
```emacs-lisp
(global-set-key (kbd "C-H-Z") 'zotra-extras-add-entry)
```
`tetris` {#tetris}
_[tetris](https://github.com/emacs-mirror/emacs/blob/master/lisp/play/tetris.el) is an implementation of the classic Tetris game for Emacs._
And finally, the section you've all been waiting for.
```emacs-lisp
(use-feature tetris
:bind
(:map tetris-mode-map
("k" . tetris-rotate-prev)
("l" . tetris-move-down)
("j" . tetris-move-left)
(";" . tetris-move-right)))
```
appendices {#appendices}
key bindings {#key-bindings}
Emacs has five native [modifier keys](https://www.gnu.org/software/emacs/manual/html_node/emacs/Modifier-Keys.html): `Control` (`C`), `Meta` (`M`), `Super` (`s`), `Hyper` (`H`), and `Alt` (`A`). (The letter abbreviation for the `Super` modifier is `s` because `S` is assigned to the `Shift` key.) I use [Karabiner-Elements](https://karabiner-elements.pqrs.org/), in combination with a [Moonlander keyboard](https://www.zsa.io/moonlander/), to generate several additional "pseudo modifiers", or mappings between individual keys and combinations of two or more Emacs modifiers:
{{< figure src="/ox-hugo/moonlander-emacs.png" >}}
So when you see a monstrous key binding such as `C-H-M-s-d`, remember that everything that precedes the final key (in this case, `d`) represents a single key press (in this case, `l`). For details, see my Karabiner config file, specifically the "Key associations" section.
I set key bindings in the following ways:
- With the `:bind` keyword of `use-package`.
- For commands provided by the package or feature being loaded in that block.
- For commands provided by other packages or features, when these are being set in a keymap provided by the feature being loaded in that block. This approach is appropriate when one wants to bind a key to a command in a keymap and wants this binding to be active even before the feature providing the command is loaded. Example:
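A hypothetical sketch of this pattern (the package and key are illustrative): `dired-narrow` is provided by a separate package, but the binding lives in `dired-mode-map`, so it goes in the `dired` block and becomes active as soon as the map exists:
```emacs-lisp
(use-package dired
  :bind
  (:map dired-mode-map
        ;; `dired-narrow' comes from the dired-narrow package,
        ;; not from dired itself
        ("/" . dired-narrow)))
```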
- With `bind-keys` within the `:init` section of `use-package`.
- For commands provided by the feature being loaded in that block that are bound in a keymap provided by another feature. This is appropriate when the feature providing the keymap may load after the feature providing the command. In these cases, using `:bind` is not possible since Emacs will throw an error. Example: binding the command `scroll-down-command`, which is provided by `window`, to the key `y` in `elfeed-show-mode-map`, which is provided by `elfeed`. Note that if the command is not natively autoloaded, an autoload must be set, e.g. `(autoload #'=scroll-down-command "window" nil t)`. [confirm that this is so: it’s possible `bind-keys` automatically creates autoloads, just like the `:bind` keyword does]
- For commands provided by the feature being loaded in that block that are bound globally and should be available even before the feature is configured. (Note that I distinguish between the loading of a feature and its configuration. A key binding specified via the `:bind` keyword of `use-package` will be available before the package is loaded but only after it is configured. If, for example, the block includes the `:after` keyword, the package will only be configured after that condition is satisfied.) Example: `ebib-extras-open-or-switch`, which is provided by `ebib-extras` and will be configured after `ebib` is loaded, yet we want it to be available even before `ebib` is loaded.
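The second `bind-keys` case can be sketched with the example just given (the `s-e` key is illustrative, not my actual binding):
```emacs-lisp
(use-package ebib-extras
  :after ebib
  :init
  ;; the command is not natively autoloaded, so declare the autoload
  (autoload #'ebib-extras-open-or-switch "ebib-extras" nil t)
  ;; bound in :init so the key works even before `ebib' loads
  ;; and this block's configuration runs
  (bind-keys ("s-e" . ebib-extras-open-or-switch)))
```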
profiling {#profiling}
- If you use `use-package`, the command `use-package-report` displays a table showing the impact of each package on load times.
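Note that `use-package-report` only has data to display if statistics gathering was enabled before the `use-package` forms were evaluated:
```emacs-lisp
;; must be set early in init, before the first use-package declaration
(setq use-package-compute-statistics t)
```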
installation {#installation}
For personal reference, these are the most recent Emacs installations (in reverse chronological order).
(After installing, you may need to create a symlink to the `Emacs.app` folder in `/opt/homebrew/Cellar/emacs-plus@30/30.1/Emacs.app`, replacing `30.1` with the actual version number.)
<span class="timestamp-wrapper"><span class="timestamp">[2025-06-27 Fri]</span></span>:
```shell
brew tap d12frosted/emacs-plus
brew install emacs-plus@30 --with-dbus --with-debug --with-xwidgets --with-imagemagick --with-spacemacs-icon
```
<span class="timestamp-wrapper"><span class="timestamp">[2024-10-09 Wed]</span></span>:
```shell
brew tap d12frosted/emacs-plus
brew install emacs-plus@30 --with-dbus --with-debug --with-native-comp --with-xwidgets --with-imagemagick --with-spacemacs-icon
```
<span class="timestamp-wrapper"><span class="timestamp">[2024-03-18 Mon]</span></span>:
```shell
brew tap d12frosted/emacs-plus
brew install emacs-plus@29 --with-dbus --with-debug --with-native-comp --with-xwidgets --with-imagemagick --with-spacemacs-icon
```
???:
```shell
brew tap d12frosted/emacs-plus
brew install emacs-plus@30 --with-dbus --with-debug --with-native-comp --with-xwidgets --with-imagemagick --with-spacemacs-icon
```
<span class="timestamp-wrapper"><span class="timestamp">[2023-02-23 Thu 02:10]</span></span>:
```shell
brew tap d12frosted/emacs-plus
brew install emacs-plus@28 --with-dbus --with-no-titlebar --with-native-comp --with-xwidgets --with-imagemagick --with-spacemacs-icon
```
- Very slow.
- Theme broke for some reason.
- Some functions (e.g. `keymap-unset`) not available.
- Telega doesn't show profile pics.
<span class="timestamp-wrapper"><span class="timestamp">[2023-02-14 Tue 20:07]</span></span>:
```shell
brew tap d12frosted/emacs-plus
brew install emacs-plus@30 --with-dbus --with-debug --with-native-comp --with-xwidgets --with-imagemagick --with-spacemacs-icon
```
<span class="timestamp-wrapper"><span class="timestamp">[2023-02-07 Tue 21:52]</span></span>:
```shell
brew install emacs-mac --with-dbus --with-starter --with-natural-title-bar --with-native-comp --with-mac-metal --with-xwidgets --with-imagemagick --with-librsvg --with-spacemacs-icon
```
other config files {#other-config-files}
Below is a link dump of config files and related links that I have found useful in the past or may want to revisit for ideas in the future.
- [Awesome Emacs](https://github.com/emacs-tw/awesome-emacs): A list of useful Emacs packages.
- [How to build your own spacemacs · Samuel Barreto](https://sam217pa.github.io/2016/09/02/how-to-build-your-own-spacemacs/)
- [Using SpaceMacs mode-line in vanilla Emacs : emacs](https://www.reddit.com/r/emacs/comments/3lt3c6/using_spacemacs_modeline_in_vanilla_emacs/)
- [How does Emacs Doom start so quickly?](https://github.com/hlissner/doom-emacs/blob/develop/docs/faq.org#how-does-doom-start-up-so-quickly) Might be useful for ideas on how to speed up config file.
- [Emacs Prelude](https://prelude.emacsredux.com/en/latest/). I've seen this recommended. Might want to check it out.
- [Polishing my Emacs -- who said an old tool can't look modern](https://www.reddit.com/r/emacs/comments/ehjcu2/screenshot_polishing_my_emacs_who_said_an_old/)
- [.emacs.d-oldv2/init-keymaps.el at master · mbriggs/.emacs.d-oldv2](https://github.com/mbriggs/.emacs.d-oldv2/blob/master/init/init-keymaps.el). Lots of key bindings.
Literate configuration
- [Setting up a spacemacs literate config file](https://commonplace.doubleloop.net/setting-up-a-spacemacs-literate-config-file)
- [Does anyone have their dotfile redone in literate programming with babel? : spacemacs](https://www.reddit.com/r/spacemacs/comments/atuzd9/does_anyone_have_their_dotfile_redone_in_literate/)
- Diego Zamboni, _[Literate configuration](https://leanpub.com/lit-config)_
- [elisp - Can I use org-mode to structure my .emacs or other .el configuration file? - Emacs Stack Exchange](https://emacs.stackexchange.com/questions/3143/can-i-use-org-mode-to-structure-my-emacs-or-other-el-configuration-file)
Some useful config files:
- [Alex Bennée](https://github.com/stsquad/my-emacs-stuff).
- [Diego Zamboni](https://zzamboni.org/post/my-emacs-configuration-with-commentary/)
- [Jamie Collinson](https://jamiecollinson.com/blog/my-emacs-config/)
- [Jethro Kuan](https://github.com/jethrokuan/dots/blob/master/.doom.d/config.el). Creator of `org-roam` and author of some great posts on note-taking. Not literate.
- [Joost Diepenmat](https://github.com/joodie/emacs-literal-config/blob/master/emacs.org)
- [Gregory Stein](https://github.com/gjstein/emacs.d). Author of the excellent [Caches to Caches](http://cachestocaches.com/) blog.
- [Luca Cambiaghi](https://luca.cambiaghi.me/vanilla-emacs/readme.html)
- [Lucien Cartier-Tilet](https://config.phundrak.com/emacs) (Spacemacs)
- [Isa Mert Gurbuz](https://github.com/isamert/dotfiles/blob/master/emacs/index.org)
- Has a cool [blog](https://isamert.net/index.html) about org mode and other topics.
- [Martin Foot](https://www.mfoot.com/blog/2015/11/22/literate-emacs-configuration-with-org-mode/)
- Has a very simple init file.
- [.org file](https://github.com/mfoo/dotfiles/blob/master/.emacs.d/config.org)
- [Murilo Pereira](https://github.com/mpereira/.emacs.d).
- Very well organized. The author has also written some excellent blog posts about Emacs.
- [OutOfCheeseError](https://out-of-cheese-error.netlify.app/spacemacs-config)
- [Protesilaos Stavrou](https://protesilaos.com/dotemacs/)
- [here](https://gitlab.com/protesilaos/dotfiles/-/blob/350ca3144c5ee868056619b9d6351fca0d6b131e/emacs/.emacs.d/emacs-init.org) is the last commit before he abandoned `use-package` and `straight`
- [Sacha Chua](https://pages.sachachua.com/.emacs.d/Sacha.html). A legend in the Emacs community.
- [Karl Voit](https://github.com/novoid/dot-emacs/blob/master/config.org).
- Author of `Memacs`, prolific blogger.
- [Sriram Krishnaswamy](https://github.com/sriramkswamy/dotemacs) ([website](https://sriramkswamy.github.io/))
- [.org file](https://sriramkswamy.github.io/dotemacs/)
- [Stephen Fromm](https://github.com/sfromm/emacs.d#twitter). Has an extended list of config files [here](https://github.com/sfromm/emacs.d#inspiration).
- [Tecosaur](https://tecosaur.github.io/emacs-config/config.html)
- [Tim Quelch](https://www.tquelch.com/posts/emacs-config/#languages)
- [Vianney Lebouteiller](http://irfu.cea.fr/Pisp/vianney.lebouteiller/emacs.html#orgbcdc8b2)
- [Xuan Bi](https://github.com/bixuanzju/emacs.d/blob/master/emacs-init.org#meta).
- [GitHub - turbana/emacs-config: My personal emac's configuration](https://github.com/turbana/emacs-config). Some potentially useful stuff on native comp, debugging, etc.
- [dotfiles/.emacs at master · creichert/dotfiles · GitHub](https://github.com/creichert/dotfiles/blob/master/emacs/.emacs). Has detailed Gnus, Slack config.
- [yay-evil-emacs](https://github.com/ianpan870102/yay-evil-emacs). slick design.
- [GitHub - rememberYou/.emacs.d: 🎉 Personal GNU Emacs configuration](https://github.com/rememberYou/.emacs.d). Has a bunch of Reddit posts explaining how he uses the different packages.
- [emacs-config/config.org at master · nkicg6/emacs-config · GitHub](https://github.com/nkicg6/emacs-config/blob/master/config.org). Found it while searching for org-ref.
- [dot-emacs/init.el at master · yiufung/dot-emacs · GitHub](https://github.com/yiufung/dot-emacs/blob/master/init.el). Not literate. Lots of packages. Gnus, notmuch, Slack, etc. Author has a great post on Anki.
- [GitHub - tshu-w/.emacs.d: My personal Emacs config, based on Spacemacs](https://github.com/tshu-w/.emacs.d). Has nice note-taking config, with org-roam, org-ref, Zotero, etc (see [here](https://github.com/tshu-w/.emacs.d/blob/master/lisp/lang-org.el)).
- [Radon Rosborough](https://github.com/raxod502/radian/blob/e3aad124c8e0cc870ed09da8b3a4905d01e49769/emacs/radian.el). Author of `straight` package manager.
- [Gonçalo Santos](https://github.com/weirdNox/dotfiles/blob/master/config/.config/emacs/config.org). Author of `org-noter`.
- [Tony Aldon](https://github.com/tonyaldon/emacs.d/blob/master/init.el). Has some slick [videos](https://www.youtube.com/channel/UCQCrbWOFRmFYqoeou0Qv3Kg) on `org-table`. Optimized key bindings.
- [Nicholas Vollmer](https://github.com/progfolio/.emacs.d/blob/master/init.org). Maintainer of `elpaca`. I copied his `org-habits` config. Haven't yet looked at the rest, but it looks like there's plenty of valuable material.
- [emacs-config/config.org at master · yantar92/emacs-config · GitHub](https://github.com/yantar92/emacs-config/blob/master/config.org). Focus on knowledge management with org. Lots of good stuff.
- [Álvaro Ramírez](https://github.com/xenodium/dotsies/blob/main/dots.org). Also uses Karabiner.
- [Karthik Chikmagalur](https://github.com/karthink/.emacs.d). Has excellent blog posts on `avy`, `eshell`, `re-builder`, etc.
- [Iqbal Ansari](https://github.com/iqbalansari/dotEmacs).
- [Daniel Clemente](https://www.danielclemente.com/emacs/confi.html).
- [Patrick Elliott](https://github.com/patrl)]]></description></item><item><title>My keyboard setup</title><link>https://stafforini.com/notes/my-keyboard-setup/</link><pubDate>Thu, 19 Feb 2026 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/my-keyboard-setup/</guid><description>&lt;![CDATA[This is a [literate](https://en.wikipedia.org/wiki/Literate_programming) configuration for my keyboard setup: a pair of split mechanical keyboards combined with an extensive software remapping layer on macOS. The system gives me access to symbols, diacritics, navigation, deletion commands, app launching, media controls, and more—all without leaving the home row.
The configuration has two layers:
1. **Keyboard firmware** (QMK): maps the thumb keys to specific modifier keycodes (see [firmware configuration](#firmware-configuration)). Everything else on the keyboard stays as standard QWERTY.
2. **Karabiner + Goku**: handles all the complex behavior—dual-function keys (tap vs. hold), simlayers, app-specific rules, Unicode character input, and deep Emacs integration.
This separation keeps the firmware simple and portable: the same Karabiner configuration works with both of my keyboards, and could work with any keyboard that sends the right modifier keycodes from its thumb keys.
Hardware {#hardware}
I alternate between two split keyboards:
- **[Corne](https://github.com/foostan/crkbd)** (crkbd): a 3x5+3 split keyboard—three rows of five keys per side, plus three thumb keys per side. 36 keys total.
- **[ZSA Moonlander](https://www.zsa.io/moonlander)**: a larger split keyboard with a full alpha block, number row, and a thumb cluster. I use only the alpha keys and thumb cluster; the rest is inert.
Both run [QMK](https://qmk.fm/) firmware. The Corne uses the `corne_rotated` variant with the `LAYOUT_split_3x5_3` matrix.
I use the same layout on the built-in MacBook keyboard. The six keys on the bottom row—left control, left option, left command, space, right command, and right option—correspond to the six thumb keys from left to right. No firmware configuration is needed: these keys already send the expected keycodes natively.
Firmware configuration {#firmware-configuration}
The firmware does almost nothing. The alpha keys are standard QWERTY, and the six thumb keys send modifier keycodes:
| Position | Keycode | VIA name | QMK equivalent |
|--------------|-----------------|-----------|----------------|
| Left outer | `left_control` | Left Ctrl | `KC_LCTL` |
| Left middle | `left_option` | Left Alt | `KC_LALT` |
| Left inner | `left_command` | Left Win | `KC_LGUI` |
| Right inner | `spacebar` | Space | `KC_SPC` |
| Right middle | `right_command` | Right Win | `KC_RGUI` |
| Right outer | `right_option` | Right Alt | `KC_RALT` |
These keycodes don't do what their names suggest. They're just handles for Karabiner to intercept and redefine. The actual behavior of each key is determined entirely in software (see [individual bindings](#individual-bindings)).
Setup steps {#setup-steps}
Step 1: configure the keyboard firmware {#step-1-configure-the-keyboard-firmware}
If you have a normal keyboard (no programmable thumb keys), skip to step 2. The MacBook's built-in keyboard already sends the right keycodes from its bottom-row modifier keys, so no firmware step is needed.
If you have a split or ergonomic keyboard with programmable thumb keys, use the keyboard's firmware to map the thumb keys to the keycodes listed in the table in the [firmware configuration](#firmware-configuration) section above. The alpha keys should remain standard QWERTY; only the thumb keys need to be changed.
**Option A: VIA** (easiest, no install required):
1. Open [usevia.app](https://usevia.app) in Chrome (requires WebHID support).
2. Plug in the keyboard via USB. It should auto-detect; if not, your keyboard may need VIA support enabled in its firmware, or you may need to load a JSON definition file from the manufacturer.
3. Click on each thumb key and assign the keycodes shown above. In VIA's keycode picker, they are labeled Left Ctrl, Left Alt, Left Win, Space, Right Win, and Right Alt.
4. Changes take effect immediately—no compiling or flashing needed.
5. For split keyboards, only one half needs to be connected via USB; the other half communicates via the TRRS cable.
**Option B: QMK Configurator** (if VIA is not supported):
1. Go to [QMK Configurator](https://config.qmk.fm) and select your keyboard.
2. Set the alpha keys to standard QWERTY and the thumb keys as shown above (QMK uses PC naming: Alt = Option, GUI = Command).
3. You do not need any firmware layers—Karabiner handles everything.
4. Click "Compile", then "Download firmware" to get the `.hex` file.
5. Install [QMK Toolbox](https://github.com/qmk/qmk_toolbox) (`brew install --cask qmk-toolbox`).
6. Open QMK Toolbox, load the `.hex` file, and enable "Auto-Flash".
7. Put the keyboard into bootloader mode (typically by double-tapping the reset button on the controller). QMK Toolbox should print "Caterina device connected" (for Pro Micro) or "DFU device connected" (for Elite-C) and begin flashing automatically.
8. For split keyboards, **flash each half separately**: connect USB to one half at a time (without the TRRS cable), flash it, then do the same for the other half. Only one half should be connected to the computer via USB during normal use; the two halves communicate via the TRRS cable.
Once configured, verify the keyboard appears in Karabiner-Elements &gt; Devices and that its vendor ID matches the `:devices` section of this config.
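The vendor ID check refers to Goku's `:devices` map, which names a device so rules can be scoped to it. A minimal sketch (the alias and IDs are illustrative; use the values Karabiner-Elements reports for your keyboard):
```clojure
;; the :corne alias can then be used as a condition in rules
{:devices {:corne [{:vendor_id 18003 :product_id 1}]}}
```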
Step 2: install Karabiner and Goku {#step-2-install-karabiner-and-goku}
1. Install [Karabiner-Elements](https://karabiner-elements.pqrs.org/).
2. Install [Goku](https://github.com/yqrashawn/GokuRakuJoudo).
3. Clone this repo and load the config in `dotfiles/karabiner/karabiner.edn`.
Individual bindings {#individual-bindings}
Every thumb key has two roles: one when tapped, another when held. These roles change depending on whether the active application is Emacs.
In Emacs {#in-emacs}
| Thumb key | Tap | Hold | Emacs modifier |
|-----------------|--------------------------|------|-----------------|
| `left_control` | `C-g` (quit) | `M-` | Meta |
| `left_option` | `RET` | `S-` | Shift |
| `left_command` | `C-H-0` (toggle windows) | `H-` | Hyper |
| `spacebar` | `⌘Tab` (toggle apps) | `C-` | Control |
| `right_command` | `SPC` | `A-` | Alt (in combos) |
| `right_option` | `TAB` | `s-` | Super |
The `right_command` key is special: on its own it always sends spacebar (both tapped and held), but when held simultaneously with another thumb key, it activates the `A-` (Alt) modifier in Emacs. The five combinations are:
| Combination | Emacs modifier |
|----------------------------------|----------------|
| `right_command` + `left_command` | `A-H-` |
| `right_command` + `left_option` | `A-` |
| `right_command` + `spacebar` | `A-C-` |
| `right_command` + `left_control` | `A-M-` |
| `right_command` + `right_option` | `A-s-` |
With six independent Emacs modifiers (`A-`, `C-`, `H-`, `M-`, `S-`, `s-`), I have a vast keybinding space. Multi-modifier combinations like `A-C-H-M-s-` (all five non-Shift modifiers at once) are possible and used in practice, giving me access to thousands of unique Emacs bindings from a 36-key keyboard.
The simlayers extend this further. Each simlayer key, when held, maps to a specific multi-modifier combination in Emacs. For example, holding `,` activates the `A-H-M-` prefix, so `,` + `s` sends `A-H-M-s` (bound to `simple-extras-transpose-chars-backward`). This turns each simlayer into a dedicated command palette:
| Simlayer key | Emacs modifier | Function |
|--------------|----------------|-------------------|
| `b` | `C-H-` | Window sizing |
| `j` | `C-H-M-` | Deletion |
| `x` | `C-H-s-` | Avy jump |
| `z` | `A-C-s-` | Cursor movement |
| `,` | `A-H-M-` | Transposition |
| `.` | `A-C-H-` | Text manipulation |
| `/` | `C-H-M-s-` | Org-mode commands |
The remaining simlayers (`k`, `m`, `p`, `q`, `v`, `y`, `;`) don't use Emacs modifier prefixes—they insert characters directly or control the mouse.
Outside Emacs {#outside-emacs}
| Thumb key | Tap | Hold |
|-----------------|----------------------|------|
| `left_control` | `Escape` | `⌥` |
| `left_option` | `Enter` | `⇧` |
| `left_command` | `⌘⌃0` (toggle tabs) | `⌘` |
| `spacebar` | `⌘Tab` (toggle apps) | `⌃` |
| `right_command` | `Spacebar` | --- |
| `right_option` | `Tab` | `⌥` |
The `spacebar` key also has a combination effect outside Emacs: pressing `left_command` + `spacebar` triggers `` ⌘` `` (move focus to next window of the current app—e.g. the next browser window if more than one window is open).
`left_control` {#left-control}
Tapped: `C-g` (quit) in Emacs, `Escape` outside. Held: `M-` (Meta) in Emacs, `Option` outside. Two special rules intercept `⌘h` and `⌘⌥h` (which macOS normally uses to hide windows) and remap them to Emacs-compatible modifier chords.
```clojure
{:des "left_control → C-g/Escape (alone) | M-/⌥ (held)"
:rules [
[:!Ch {:modi :A-H-M-s- :key :h} :emacs]
[:!CQh {:modi :A-C-H-s- :key :h} :emacs]
[:##left_control :M- [:!steam :emacs] {:alone {:modi :C- :key :g}}]
[:##left_control :⌥ [:!steam :!emacs] {:alone :escape}]
]}
```
`left_option` {#left-option}
Tapped: `return` (which I bind to toggling between recent windows or tabs). Held: `S-` (Shift) in Emacs, `Shift` outside. Since this key now serves double duty as both Shift and Return, the combination `Shift+Return` is no longer directly available. A workaround binds `left_option` + `right_command` (i.e. Shift + spacebar) to `Shift+Enter`.
```clojure
{:des "left_option → RET (alone) | S-/⇧ (held)"
:rules [
[:##left_option :S- :emacs {:alone :return_or_enter}]
[:##left_option :⇧ :!emacs {:alone :return_or_enter}]
[:!Sright_command {:modi :⇧ :key :return_or_enter}]
]}
```
`left_command` {#left-command}
When tapped, toggles between the two most recent windows (in Emacs) or tabs (in a browser). In Emacs, `C-H-0` is bound to `window-extras-switch-to-last-window`. In Chrome and Firefox, I use the [CLUT](https://chrome.google.com/webstore/detail/clut-cycle-last-used-tabs/cobieddmkhhnbeldhncnfcgcaccmehgn) and [Last Tab](https://addons.mozilla.org/en-US/firefox/addon/last-tab/?utm_source=addons.mozilla.org&utm_medium=referral&utm_content=search) extensions with `⌘⌃0` as the shortcut.
```clojure
{:des "left_command → other window/tab (alone) | H-/⌘ (held)"
:rules [
[:##left_command :H- :emacs {:alone {:modi :C-H- :key :0}}]
[:##left_command :⌘ :!emacs {:alone {:modi :⌘⌃ :key :0}}]
]}
```
`spacebar` {#spacebar}
Tapped: `⌘Tab` (switch apps). Held: `C-` (Control) in Emacs, `⌃` outside. When tapped while the command key is held, toggles between buffers (in Emacs) or moves focus to the next window (outside Emacs).
```clojure
{:des "spacebar → ⌘Tab (alone) | C-/⌃ (held)"
:rules [
[:!Cspacebar {:modi :A-H-M-s- :key :spacebar} :emacs]
[:##spacebar :C- :emacs {:alone {:modi :⌘ :key :tab}}]
[:!Cspacebar {:modi :⌘ :key :grave_accent_and_tilde} :!emacs]
[:##spacebar :⌃ :!emacs {:alone {:modi :⌘ :key :tab}}]
]}
```
`right_command` {#right-command}
On its own, `right_command` always sends spacebar---both tapped and held, both in and outside Emacs. Its special role is as a modifier _combiner_: when held simultaneously with another thumb key in Emacs, it activates the `A-` (Alt) modifier, creating two-modifier chords. This is what gives me access to the sixth Emacs modifier without a dedicated physical key for it.
```clojure
{:des "right_command modifier combinations"
:rules [
[[:right_command :left_command] :A-H- :emacs {:alone {:modi :A- :key :spacebar}}]
[[:right_command :left_option] :A- :emacs]
[[:right_command :spacebar] :A-C- :emacs]
[[:right_command :left_control] :A-M- :emacs]
[[:right_command :right_option] :A-s- :emacs]
]}
```
```clojure
{:des "right_command → spacebar"
:rules [
[:##right_command :spacebar]
]}
```
`right_option` {#right-option}
Tapped: `Tab`. Held: `s-` (Super) in Emacs, `Option` outside.
```clojure
{:des "right_option → Tab (alone) | s-/⌥ (held)"
:rules [
[:##right_option :s- :emacs {:alone :tab}]
[:##right_option :⌥ :!emacs {:alone :tab}]
]}
```
mouse {#mouse}
My Logitech MX Anywhere 3S mouse does not have a [tilting wheel](https://computer.howstuffworks.com/mouse11.htm), so I remap the side buttons to trigger the relevant tab navigation shortcuts when pressed while the shift key is held.
```clojure
{:des "left_shift + side mouse buttons → navigate open tabs"
:rules [
[{:pkey :button4 :modi :left_shift} :!COleft_arrow]
[{:pkey :button5 :modi :left_shift} :!COright_arrow]
]}
```
Note that for these “extra” mouse buttons to be detected by Karabiner, you may need to enable the relevant device in the “Devices” section.
Layers {#layers}
Simlayers are the most distinctive feature of this setup. A simlayer activates when you hold down a specific key and press another key simultaneously. Unlike traditional layers that require a dedicated layer-switch key, simlayers let any regular typing key double as a layer trigger—the key still types its letter on a normal tap, but holding it down and pressing a second key activates the layer.
I have 14 simlayers, each triggered by a different home-row or common key. The letter keys used as triggers are chosen from the least frequent letters in English, minimizing accidental activations during normal typing:
| Key | Frequency | Function |
|-----|-----------|-------------------|
| `z` | 0.074% | Navigation |
| `q` | 0.095% | App launcher |
| `x` | 0.15% | Avy jump |
| `j` | 0.15% | Deletion |
| `k` | 0.77% | Special chars |
| `v` | 0.98% | Numbers |
| `b` | 1.5% | Media/windows |
| `p` | 1.9% | Diacritics |
| `y` | 2.0% | Mouse/screenshots |
| `m` | 2.4% | Math symbols |
(Source: [English letter frequency on Wikipedia](https://en.wikipedia.org/wiki/Letter_frequency).)
The remaining four simlayers use punctuation keys: `,` (transposition), `.` (text manipulation), `;` (symbols), and `/` (Org-mode).
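In Goku, each simlayer must be declared once in the config's `:simlayers` map before any `:rules [:x-mode ...]` blocks can reference it. A minimal sketch with two of the triggers (the threshold value is illustrative):
```clojure
{:simlayers {:v-mode {:key :v}
             :q-mode {:key :q}}
 ;; max interval (ms) between pressing the trigger and the second key
 :simlayer-threshold 250}
```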
{{< figure src="/images/keyboard/base.svg" >}}
`b-mode`: media/windows {#b-mode-media-windows}
Hold `b` for media controls and window sizing. The right-hand top row keys control window placement (via Emacs Lisp commands in Emacs, or via [Rectangle](https://github.com/rxhanson/Rectangle) otherwise), the home row keys control playback, and the bottom row keys control volume. `h` and `n` control dictation and narration, respectively (the keyboard shortcuts need to be set under Settings &gt; Keyboard &gt; Dictation &gt; Shortcut for dictation, and Settings &gt; Accessibility &gt; Spoken content &gt; Speak selection for narration).
{{< figure src="/images/keyboard/b-mode.svg" >}}
```clojure
{:des "b-mode (media controls, window sizing)"
:rules [:b-mode
[:u {:modi :C-H- :key :u} :emacs]
[:u {:modi :⇧⌘⌥⌃ :key :u}]
[:p {:modi :C-H- :key :p} :emacs]
[:p {:modi :⇧⌘⌥⌃ :key :p}]
[:i {:modi :C-H- :key :i} :emacs]
[:i {:modi :⇧⌘⌥⌃ :key :i}]
[:o {:modi :⇧⌘⌥⌃ :key :o}]
[:j :rewind]
[:x {:modi :⇧⌘⌃ :key :3}]
[:k :play_or_pause]
[:semicolon :fast_forward]
[:comma :volume_increment]
[:period :volume_decrement]
[:m :mute]
[:c {:modi :C-M-s- :key :c} :emacs]
[:g {:modi :C-M-s- :key :g} :emacs]
[:h {:modi :⇧⌘⌥⌃ :key :0}]
[:n {:modi :⇧⌘⌥⌃ :key :1}]
]}
```
`k-mode`: special characters {#k-mode-special-characters}
Hold `k` to type characters from various European languages and typographic symbols (eth, thorn, sharp s, guillemets, interrobang, etc.).
{{< figure src="/images/keyboard/k-mode.svg" >}}
```clojure
{:des "k-mode (special chars)"
:rules [:k-mode
[:d :ð]
[:e :…]
[:i :¿]
[:j [:insert "⸘"]]
[:k [:insert "‽"]]
[:o :ø]
[:p :£]
[:q :œ]
[:r :€]
[:s :ß]
[:t :þ]
[:u :•]
[:slash :¡]
[:comma :«]
[:period :»]
]}
```
`j-mode`: deletion {#j-mode-deletion}
Hold `j` for deletion commands. Outside Emacs, the keys map to standard macOS editing shortcuts (e.g. `s` for backspace, `d` for delete forward, `q` for delete word backward, `w` for delete to line start, `e` for delete to line end). In Emacs, each key triggers a dedicated deletion function via the `C-H-M-` modifier prefix, covering characters, words, sentences, sexps, and more.
{{< figure src="/images/keyboard/j-mode-emacs.svg" >}}
{{< figure src="/images/keyboard/j-mode-macos.svg" >}}
```clojure
{:des "j-mode (deletion)"
:rules [:j-mode
[:a {:modi :C-H-M- :key :a} :emacs]
[:s {:modi :C-H-M- :key :s} :emacs]
[:s :delete_or_backspace :!emacs]
[:d {:modi :C-H-M- :key :d} :emacs]
[:d :delete_forward :!emacs]
[:f {:modi :C-H-M- :key :f} :emacs]
[:q {:modi :C-H-M- :key :q} :emacs]
[:q {:modi :⌥ :key :delete_or_backspace} :!emacs]
[:w {:modi :C-H-M- :key :w} :emacs]
[:w {:modi :⌘ :key :delete_or_backspace} :!emacs]
[:e {:modi :C-H-M- :key :e} :emacs]
[:e {:modi :⌃ :key :k} :!emacs]
[:r {:modi :C-H-M- :key :r} :emacs]
[:r {:modi :⌥ :key :delete_forward} :!emacs]
[:z {:modi :C-H-M- :key :z} :emacs]
[:z {:modi :⌘ :key :delete_or_backspace} :!emacs]
[:x {:modi :C-H-M- :key :x} :emacs]
[:x {:modi :⌘⌥ :key :left_arrow} :!emacs]
[:c {:modi :C-H-M- :key :c} :emacs]
[:c {:modi :⌘⌥ :key :right_arrow} :!emacs]
[:v {:modi :C-H-M- :key :v} :emacs]
[:v {:modi :⌘ :key :delete_forward} :!emacs]
[:b {:modi :C-H-M- :key :b} :emacs]
[:t {:modi :C-H-M- :key :t} :emacs]
[:t :home :!emacs]
[:g {:modi :C-H-M- :key :g} :emacs]
[:g :end :!emacs]
]}
```
`m-mode`: math symbols {#m-mode-math-symbols}
Hold `m` for mathematical operators. Some of these characters have native Option+key shortcuts on the ABC Extended input source. The rest are inserted via a shell command that copies the character to the clipboard and pastes it with `⌘V`. This means that using them will overwrite whatever is currently on the clipboard.
{{< figure src="/images/keyboard/m-mode.svg" >}}
```clojure
{:des "m-mode (math symbols)"
:rules [:m-mode
[:e :=]
[:p :+]
[:q :≠]
[:d :÷]
[:o :±]
[:x [:insert "×"]]
[:comma :≤]
[:period :≥]
[:a [:insert "≈"]]
[:i [:insert "∞"]]
[:s [:insert "−"]]
[:t :·]
[:g :°]
[:r [:insert "√"]]
[:c [:insert "π"]]
]}
```
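The clipboard-paste mechanism described above amounts to something like the following (a sketch; `pbcopy` exists only on macOS, so a harmless stand-in is used elsewhere):
```shell
# what an [:insert "×"] rule effectively does: put the character on the
# clipboard, after which Karabiner sends ⌘V to paste it
clip=$(command -v pbcopy || command -v cat)
printf '%s' '×' | "$clip"
```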
`p-mode`: diacritics {#p-mode-diacritics}
Hold `p` to add a diacritical mark to the next character you type. This uses macOS's "ABC - Extended" input source. For example, typing `p` + `e` then `e` produces é; typing `p` + `u` then `o` produces ö.
{{< figure src="/images/keyboard/p-mode.svg" >}}
```clojure
{:des "p-mode (diacritics)"
:rules [:p-mode
[:a :macron]
[:b :breve]
[:c :cedilla]
[:e :accute_accent]
[:g :undercomma]
[:h :underbar]
[:i :horn]
[:j :double_acute_accent]
[:k :overring]
[:l :stroke]
[:m :ogonek]
[:n :tilde_accent]
[:r :grave_accent]
[:u :umlaut]
[:v :caron]
[:w :underdot]
[:x :overdot]
[:y :circumflex]
[:z :hook]
]}
```
`q-mode`: apps {#q-mode-apps}
Hold `q` to open an application. More than 20 applications are mapped. The letter choices are mnemonic where possible (e.g. `e` for Emacs, `f` for Firefox, `t` for Terminal).
{{< figure src="/images/keyboard/q-mode.svg" >}}
```clojure
{:des "q-mode (apps)"
:rules [:q-mode
[:b [:open "/Applications/qBitTorrent.app"]]
[:c [:open "/Applications/Audacity.app"]]
[:d [:open "/System/Library/CoreServices/Finder.app"]]
[:e [:open "/Applications/Emacs.app"]]
[:f [:open "/Applications/Firefox.app"]]
[:g [:open "/Applications/GoldenDict-ng.app"]]
[:h [:open "/Applications/Google Chrome.app"]]
[:i [:open "/Applications/Anki.app"]]
[:j [:open "/System/Applications/System Settings.app"]]
[:l [:open "/Applications/DeepL.app"]]
[:m [:open "/Applications/Media Center 29.app"]]
[:o [:open "/Applications/zoom.us.app"]]
[:p [:open "/Applications/Beeper Desktop.app"]]
[:r [:open "/Applications/Karabiner-Elements.app"]]
[:s [:open "/Applications/Slack.app"]]
[:t [:open "/System/Applications/Utilities/Terminal.app"]]
[:v [:open "/Applications/mpv.app"]]
[:w [:open "/Applications/HoudahSpot.app"]]
[:x [:open "/Applications/Plex Media Server.app/"]]
[:y [:open "/Applications/Spotify.app"]]
[:comma [:open "/Applications/Home Assistant.app"]]
[:period [:open "/Applications/Tor Browser.app"]]
[:slash [:open "/Applications/Safari.app"]]
]}
```
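Like `:insert`, the `:open` action is presumably a user-defined template rather than a Karabiner built-in — most likely a thin wrapper around the macOS `open` command. A minimal sketch, under that assumption:

```clojure
;; Hypothetical :open template — Goku substitutes the app path given
;; in each rule into the macOS "open" command. The actual definition
;; lives elsewhere in the config.
{:templates {:open "open \"%s\""}}
```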
`v-mode`: numbers {#v-mode-numbers}
Hold `v` to turn the right hand into a numpad. This is one of the most-used simlayers—on a 36-key keyboard with no number row, this is how I type numbers.
{{< figure src="/images/keyboard/v-mode.svg">}}
```clojure
{:des "v-mode (numbers)"
:rules [:v-mode
[:i :8]
[:##i :8]
[:j :4]
[:##j :4]
[:k :5]
[:##k :5]
[:l :6]
[:##l :6]
[:m :1]
[:##m :1]
[:o :9]
[:##o :9]
[:u :7]
[:##u :7]
[:comma :2]
[:##comma :2]
[:period :3]
[:##period :3]
[:p :0]
[:##p :0]
[:semicolon :period]
[:##semicolon :period]
]}
```
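For any of these layers to work, the mode has to be declared as a simlayer keyed to its trigger. The declaration is not shown in this post; a minimal sketch of what it would look like for `v-mode`, with an assumed threshold value:

```clojure
;; Hypothetical simlayer declaration for v-mode — the actual
;; :simlayers block and threshold live elsewhere in the config.
;; Pressing another key within the threshold while v is held
;; activates the layer; a quick tap of v on its own still types "v".
{:simlayer-threshold 250  ;; milliseconds (assumed value)
 :simlayers {:v-mode {:key :v}}}
```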
`x-mode`: avy {#x-mode-avy}
Hold `x` to trigger [Avy](https://github.com/abo-abo/avy) jump commands in Emacs (this simlayer is Emacs-only). Each key maps to a unique Avy command via the `C-H-s-` modifier prefix, providing instant, keyboard-driven navigation to any visible position in the buffer.
{{< figure src="/images/keyboard/x-mode.svg">}}
```clojure
{:des "x-mode (avy)"
:rules [:x-mode
[:a [{:modi :C-H-s- :key :a}]]
[:b [{:modi :C-H-s- :key :b}]]
[:c [{:modi :C-H-s- :key :c}]]
[:d [{:modi :C-H-s- :key :d}]]
[:e [{:modi :C-H-s- :key :e}]]
[:f [{:modi :C-H-s- :key :f}]]
[:g [{:modi :C-H-s- :key :g}]]
[:h [{:modi :C-H-s- :key :h}]]
[:i [{:modi :C-H-s- :key :i}]]
[:j [{:modi :C-H-s- :key :j}]]
[:k [{:modi :C-H-s- :key :k}]]
[:l [{:modi :C-H-s- :key :l}]]
[:m [{:modi :C-H-s- :key :m}]]
[:n [{:modi :C-H-s- :key :n}]]
[:o [{:modi :C-H-s- :key :o}]]
[:p [{:modi :C-H-s- :key :p}]]
[:q [{:modi :C-H-s- :key :q}]]
[:r [{:modi :C-H-s- :key :r}]]
[:s [{:modi :C-H-s- :key :s}]]
[:t [{:modi :C-H-s- :key :t}]]
[:u [{:modi :C-H-s- :key :u}]]
[:v [{:modi :C-H-s- :key :v}]]
[:w [{:modi :C-H-s- :key :w}]]
[:y [{:modi :C-H-s- :key :y}]]
[:z [{:modi :C-H-s- :key :z}]]
[:semicolon [{:modi :C-H-s- :key :semicolon}]]
[:comma [{:modi :C-H-s- :key :comma}]]
[:period [{:modi :C-H-s- :key :period}]]
[:slash [{:modi :C-H-s- :key :slash}]]
[:spacebar [{:modi :C-H-s- :key :spacebar}]]
[:return_or_enter [{:modi :C-H-s- :key :return_or_enter}]]
[:tab [{:modi :C-H-s- :key :tab}]]
]}
```
`y-mode`: mouse/screenshots {#y-mode-mouse-screenshots}
Hold `y` to move the mouse cursor with the keyboard (`a` and `f` for left/right, `s` and `d` for up/down, `q` and `r` and `w` and `e` for larger jumps) or take screenshots (`z` for full screen, `c` for selected area, `v` to copy selected area to clipboard, `b` for the screenshot toolbar). `Enter` acts as left click and `Right Command` as right click.
{{< figure src="/images/keyboard/y-mode.svg">}}
```clojure
{:des "y-mode (mouse, screenshots)"
:rules [:y-mode
[:b {:modi :⇧⌘ :key :5}]
[:c {:modi :⇧⌘ :key :4}]
[:d {:mkey {:y 1500}}]
[:e {:mkey {:y 4500}}]
[:a {:mkey {:x -1500}}]
[:f {:mkey {:x 1500}}]
[:q {:mkey {:x -4500}}]
[:r {:mkey {:x 4500}}]
[:s {:mkey {:y -1500}}]
[:v {:modi :⇧⌘⌃ :key :4}]
[:w {:mkey {:y -4500}}]
[:z {:modi :⇧⌘ :key :3}]
[:right_command :button2]
[:return_or_enter :button1]
]}
```
`z-mode`: navigation {#z-mode-navigation}
Hold `z` for cursor movement. The four arrow keys sit on the home row: `j` (left), `k` (up), `l` (down), `;` (right). This is inspired by Vim's `h-j-k-l`, with two modifications. First, the cluster is shifted one key to the right so that it aligns with the natural resting position of the four fingers. Second, up and down are swapped: `k` is up and `l` is down, placing up next to left and down next to right. This grouping works better for movement commands that can be conceptualized as either horizontal or vertical—for example, "sentence backward" is both "up" and "left," while "sentence forward" is both "down" and "right."
Beyond the arrow keys, `y` and `h` provide page up/down and `m` and `/` provide home/end. In Emacs, the keys trigger custom navigation commands via the `A-C-s-` modifier prefix, enabling more granular movement (by paragraph, sentence, defun, etc.). The `##` prefix on some keys allows shift to be held for selection.
{{< figure src="/images/keyboard/z-mode-emacs.svg">}}
{{< figure src="/images/keyboard/z-mode-macos.svg">}}
```clojure
{:des "z-mode (navigation)"
:rules [:z-mode
[:b {:modi :A-C-s- :key :b} :emacs]
[:c {:modi :A-C-s- :key :c}]
[:d {:modi :A-H-M-s- :key :d} :emacs]
[:e {:modi :A-C-s- :key :e} :emacs]
[:f [{:modi :⌘⌃ :key :n} {:modi :⌘⌃ :key :h}] :chrome]
[:f {:modi :A-C-s- :key :f}]
[:g {:modi :A-C-s- :key :g} :emacs]
[:h {:modi :A-C-s- :key :h} :emacs]
[:##h :page_down]
[:i {:modi :A-C-s- :key :i} :emacs]
[:##i {:modi :⌥ :key :up_arrow}]
[:j :left_arrow]
[:##j :left_arrow]
[:k :up_arrow]
[:##k :up_arrow]
[:l :down_arrow]
[:##l :down_arrow]
[:m {:modi :A-C-s- :key :m} :emacs]
[:##m {:modi :⌘ :key :left_arrow}]
[:n {:modi :A-C-s- :key :n} :emacs]
[:o {:modi :A-C-s- :key :o} :emacs]
[:##o {:modi :⌥ :key :down_arrow}]
[:p {:modi :A-C-s- :key :p} :emacs]
[:##p {:modi :⌥ :key :right_arrow}]
[:r [{:modi :⌘⌃ :key :p} {:modi :⌘⌃ :key :h}] :chrome]
[:r {:modi :A-C-s- :key :r} :emacs]
[:s {:modi :A-C-s- :key :s} :emacs]
[:t {:modi :A-C-s- :key :t} :emacs]
[:u {:modi :A-C-s- :key :u} :emacs]
[:##u {:modi :⌥ :key :left_arrow}]
[:v {:modi :A-C-s- :key :v} :emacs]
[:w {:modi :A-C-s- :key :w} :emacs]
[:x {:modi :A-C-s- :key :x}]
[:y {:modi :A-C-s- :key :y} :emacs]
[:##y :page_up]
[:spacebar {:modi :A-C-s- :key :spacebar} :emacs]
[:spacebar {:modi :⌘ :key :up_arrow} :chrome]
[:spacebar :home]
[:comma {:modi :A-C-s- :key :comma} :emacs]
[:##comma {:modi :⌘ :key :up_arrow}]
[:period {:modi :A-C-s- :key :period} :emacs]
[:##period {:modi :⌘ :key :down_arrow}]
[:right_command {:modi :A-C-s- :key :tab} :emacs]
[:right_command {:modi :⌘ :key :down_arrow} :chrome]
[:right_command :end]
[:semicolon :right_arrow]
[:##semicolon :right_arrow]
[:slash {:modi :A-C-s- :key :slash} :emacs]
[:##slash {:modi :⌘ :key :right_arrow}]
[:tab {:modi :A-C-s- :key :tab} :emacs]
;; [:##tab :home]
]}
```
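The `:emacs` and `:chrome` conditions attached to many of the rules above refer to application definitions declared elsewhere in the config. A plausible sketch of that declaration, with the bundle-identifier regexes as assumptions:

```clojure
;; Hypothetical :applications block backing the :emacs and :chrome
;; conditions — the bundle identifiers are assumptions, not taken
;; from the original config.
{:applications {:emacs  ["^org\\.gnu\\.Emacs$"]
                :chrome ["^com\\.google\\.Chrome$"]}}
```

Rules tagged with one of these keywords fire only when the matching application is frontmost; an untagged rule with the same key (e.g. the plain `[:spacebar :home]` fallback) applies everywhere else.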
`,-mode`: transposition {#mode-transposition}
Hold `,` to trigger transposition commands in Emacs (transpose characters, words, lines, sentences, paragraphs, sexps, etc.) via the `A-H-M-` modifier prefix. This simlayer is Emacs-only.
{{< figure src="/images/keyboard/comma-mode.svg">}}
```clojure
{:des "comma-mode (transposition)"
:rules [:comma-mode
[:a {:modi :A-H-M- :key :a}]
[:b {:modi :A-H-M- :key :b}]
[:c {:modi :A-H-M- :key :c}]
[:d {:modi :A-H-M- :key :d}]
[:e {:modi :A-H-M- :key :e}]
[:f {:modi :A-H-M- :key :f}]
[:g {:modi :A-H-M- :key :g}]
[:h {:modi :A-H-M- :key :h}]
[:i {:modi :A-H-M- :key :i}]
[:j {:modi :A-H-M- :key :j}]
[:l {:modi :A-H-M- :key :l}]
[:n {:modi :A-H-M- :key :n}]
[:o {:modi :A-H-M- :key :o}]
[:p {:modi :A-H-M- :key :p}]
[:q {:modi :A-H-M- :key :q}]
[:r {:modi :A-H-M- :key :r}]
[:s {:modi :A-H-M- :key :s}]
[:t {:modi :A-H-M- :key :t}]
[:u {:modi :A-H-M- :key :u}]
[:v {:modi :A-H-M- :key :v}]
[:w {:modi :A-H-M- :key :w}]
[:x {:modi :A-H-M- :key :x}]
[:y {:modi :A-H-M- :key :y}]
[:z {:modi :A-H-M- :key :z}]
[:semicolon {:modi :A-H-M- :key :semicolon}]
[:period {:modi :A-H-M- :key :period}]
[:slash {:modi :A-H-M- :key :slash}]
[:spacebar {:modi :A-H-M- :key :spacebar}]
[:return_or_enter {:modi :A-H-M- :key :return_or_enter}]
[:tab {:modi :A-H-M- :key :tab}]
]}
```
`.-mode`: text manipulation {#dot-mode-text-manipulation}
Hold `.` to trigger text manipulation commands (sorting, aligning, casing, commenting, etc.) via the `A-C-H-` modifier prefix.
{{< figure src="/images/keyboard/period-mode.svg">}}
```clojure
{:des "period-mode (manipulation)"
:rules [:period-mode
[:a {:modi :A-C-H- :key :a}]
[:b {:modi :A-C-H- :key :b}]
[:c {:modi :A-C-H- :key :c}]
[:d {:modi :A-C-H- :key :d}]
[:e {:modi :A-C-H- :key :e}]
[:f {:modi :A-C-H- :key :f}]
[:g {:modi :A-C-H- :key :g}]
[:h {:modi :A-C-H- :key :h}]
[:i {:modi :A-C-H- :key :i}]
[:j {:modi :A-C-H- :key :j}]
[:k {:modi :A-C-H- :key :k}]
[:l {:modi :A-C-H- :key :l}]
[:m {:modi :A-C-H- :key :m}]
[:n {:modi :A-C-H- :key :n}]
[:o {:modi :A-C-H- :key :o}]
[:p {:modi :A-C-H- :key :p}]
[:q {:modi :A-H-M-s- :key :9}]
[:r {:modi :A-C-H- :key :r}]
[:s {:modi :A-C-H- :key :s}]
[:t {:modi :A-C-H- :key :t}]
[:u {:modi :A-C-H- :key :u}]
[:v {:modi :A-C-H- :key :v}]
[:w {:modi :A-C-H- :key :w}]
[:x {:modi :A-C-H- :key :x}]
[:y {:modi :A-C-H- :key :y}]
[:z {:modi :A-C-H- :key :z}]
[:semicolon {:modi :A-C-H- :key :semicolon}]
[:comma {:modi :A-C-H- :key :comma}]
[:slash {:modi :A-C-H- :key :slash}]
[:spacebar {:modi :A-C-H- :key :spacebar}]
[:return_or_enter {:modi :A-C-H- :key :return_or_enter}]
]}
```
`;-mode`: symbols {#mode-symbols}
Hold `;` for symbols that would normally require Shift + number row (e.g. `!@#$%^&*`), as well as brackets, braces, quotes, backtick, tilde, en-dash, and em-dash. This eliminates the need for a number row to access these characters.
{{< figure src="/images/keyboard/semicolon-mode.svg">}}
```clojure
{:des "semicolon (symbols)"
:rules [:semicolon-mode
[:##a :percent_sign]
[:##b :grave_accent_and_tilde]
[:##c :open_bracket]
[:##d :close_parenthesis]
[:##e :number_sign]
[:##f :asterisk]
[:##g :caret]
[:##h :ampersand]
[:##i :open_single_quote]
[:##j :double_quote]
[:##k :open_double_quote]
[:##l :close_double_quote]
[:##m :hyphen]
[:##n :tilde]
[:##o :close_single_quote]
[:##q :exclamation_mark]
[:##r :dollar_sign]
[:##s :open_parenthesis]
[:##t :backslash]
[:##u :quote]
[:##v :close_bracket]
[:##w :at_sign]
[:##x :close_brace]
[:##y :vertical_bar]
[:##z :open_brace]
[:##comma :en_dash]
[:##period :em_dash]
[:##right_command :underscore]
]}
```
`/-mode`: org-mode {#mode-org-mode}
Hold `/` in Emacs to access org-mode commands via the `C-H-M-s-` modifier prefix.
{{< figure src="/images/keyboard/slash-mode-emacs.svg">}}
```clojure
{:des "slash simlayer → org-mode"
:rules [:slash-mode
[:a {:modi :C-H-M-s- :key :a} :emacs]
[:b {:modi :C-H-M-s- :key :b} :emacs]
[:c {:modi :C-H-M-s- :key :c} :emacs]
[:d {:modi :C-H-M-s- :key :d} :emacs]
[:e {:modi :C-H-M-s- :key :e} :emacs]
[:f {:modi :C-H-M-s- :key :f} :emacs]
[:g {:modi :C-H-M-s- :key :g} :emacs]
[:h {:modi :C-H-M-s- :key :h} :emacs]
[:i {:modi :C-H-M-s- :key :i} :emacs]
[:j {:modi :C-H-M-s- :key :j} :emacs]
[:m {:modi :C-H-M-s- :key :m} :emacs]
[:n {:modi :C-H-M-s- :key :n} :emacs]
[:o {:modi :C-H-M-s- :key :o} :emacs]
[:p {:modi :C-H-M-s- :key :p} :emacs]
[:q {:modi :C-H-M-s- :key :q} :emacs]
[:r {:modi :C-H-M-s- :key :r} :emacs]
[:s {:modi :C-H-M-s- :key :s} :emacs]
[:t {:modi :C-H-M-s- :key :t} :emacs]
[:u {:modi :C-H-M-s- :key :u} :emacs]
[:v {:modi :C-H-M-s- :key :v} :emacs]
[:w {:modi :C-H-M-s- :key :w} :emacs]
[:z {:modi :C-H-M-s- :key :z} :emacs]
[:x {:modi :C-H-M-s- :key :x} :emacs]
[:y {:modi :C-H-M-s- :key :y} :emacs]
[:period {:modi :C-H-M-s- :key :period} :emacs]
]}
```
Summary {#summary}
This setup trades a steep learning curve for an extremely compact and efficient input system. The key design decisions are:
- **Minimal firmware, maximal software.** The keyboard firmware is nearly a no-op. All intelligence lives in Karabiner/Goku, which is easier to iterate on and works across keyboards.
- **Simlayers over firmware layers.** Instead of dedicated layer-switch keys, every letter can be a layer trigger. This is only possible because Karabiner's simlayer mechanism is time-based, distinguishing a quick tap from a held key.
- **Context-aware behavior.** The same physical key does different things in Emacs vs. other apps. This lets me have Emacs-native deletion, navigation, and transposition commands alongside standard macOS shortcuts.
- **Six Emacs modifiers from six thumb keys.** By mapping all six Emacs modifier keys to physical keys, I have a keybinding space large enough to support thousands of unique commands.]]></description></item><item><title>Paul Christiano on cause prioritization</title><link>https://stafforini.com/notes/paul-christiano-on-cause-prioritization/</link><pubDate>Mon, 24 Mar 2014 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/paul-christiano-on-cause-prioritization/</guid><description>&lt;![CDATA[Paul Christiano is a graduate student in computer science at UC Berkeley. His academic research interests include algorithms and quantum computing. Outside academia, he [has written](http://rationalaltruist.com/) about various topics of interest to effective altruists, with a focus on the far future.  Christiano holds a BA in mathematics from MIT and has represented the United States at the International Mathematical Olympiad. He is a Research Associate at the [Machine Intelligence Research Institute](http://intelligence.org/) and a Research Advisor at [80,000 Hours](http://80000hours.org/).
---
**Pablo**: To get us started, could you explain what you mean by 'cause prioritization', and briefly discuss the various types of cause prioritization research that are currently being conducted?
**Paul**: I mean research that helps determine which broad areas of investment are likely to have the largest impact on the things we ultimately care about. Of course a huge amount of research bears on this question, but I'm most interested in research that addresses its distinctive characteristics. In particular, I'm most interested in:
1. Research that draws on what is known about different areas in order to actually make these comparisons. I think [GiveWell Labs](http://www.givewell.org/givewell-labs) and the [Copenhagen Consensus Center](http://www.copenhagenconsensus.com/) are the two most salient examples of this work, though they have quite different approaches. I understand that the [Centre for Effective Altruism](http://centreforeffectivealtruism.org/) (CEA) is beginning to invest in this area as well. I think this is an area where people will be able to get a lot of traction (and have already done pretty well for the amount of investment I'm aware of) and I think it will probably go a very long way towards facilitating issue-agnostic giving.
2. Research that aims to understand and compare the long-term impacts of the short-term changes which our investments can directly bring about. For example, research that clarifies and compares the long-term impact of poverty alleviation, technological progress, or environmental preservation, and how important that long-term impact is. This is an area where answers are much harder to come by, but even slight improvements in our understanding would have significant importance for a very broad range of decisions. It appears that high-quality work in this area is pretty rare, though it's a bit hard to tell if this is due to very little investment or if this is merely evidence that making progress on these problems is too difficult. I tend to lean towards the former, because (a) we see very little public discussion of process and failed attempts for high-quality research on these issues, which you should expect to see even if they are quite challenging, and (b) this is not an area that I expect to receive a lot of investment except by cause-agnostic altruists who are looking quite far ahead. I think the most convincing example to date is Nick Bostrom's [astronomical waste argument](http://intelligence.org/files/AstronomicalWaste.pdf) and Nick Beckstead's [more extensive discussion of the importance of the far future](https://docs.google.com/viewer?a=v&pid=sites&srcid=ZGVmYXVsdGRvbWFpbnxuYmVja3N0ZWFkfGd4OjExNDBjZTcwNjMxMzRmZGE), which seem to take a small but reasonably robust step towards improving our understanding of what to do.
There are certainly other relevant research areas, but they tend to be less interesting as cause prioritization per se. For example, there is a lot of work that tries to better understand the impact of particular interventions. I think this is comparably important to (1) or (2), but it currently receives quite a lot more attention, and it's not clear that a cause-agnostic philanthropist would want to change how this is being done. More tangentially, efforts to improve forecasts more broadly have significant relevance for philanthropic investment, though they are even more important in other domains so prima facie it would be a bit surprising if these efforts ought to be a priority by virtue of their impact on improving philanthropic decision-making.
**Pablo**: In public talks and private conversation, you have argued that instead of supporting any of the object-level interventions that look most promising on the evidence currently available, we should on the current margin invest in research on understanding which of those opportunities are most effective.  Could you give us an outline of this argument?
**Paul**: It seems very likely to me that more research will lead to a much clearer picture of the relative merits of different opportunities, so I suspect in the future we will be much better equipped to pick winners. I would not be at all surprised if supporting my best guess charity ten years from now was several times more impactful than supporting my best guess charity now.
If you are this optimistic about learning more, then it is generally better to donate to your best guess charity in a decade, rather than donating to your current best guess. But if you think there is room for more funding to help accelerate that learning process, then that might be an even better idea. I think this is the case at the moment: the learning process is mostly driven by people doing prioritization research and exploratory philanthropy, and total investment in that area is not very large.
Of course, giving to object level interventions may be an important part of learning more, and so I would be hesitant to say that we should avoid investment in object-level problems. However, I think that investment should really be focused on learning and exploring (in a way that can help other people make these decisions as well, not just the individual donor) rather than for a direct positive impact.  So for example I'm not very interested in scaling up successful global health interventions.
The most salient motivation to do good now, rather than learning or waiting, is a discount rate that is much steeper than market rates of return.
For example, you might give now if you thought your philanthropic investments would earn very high effective rates of return. I think this is unlikely for the kinds of object-level investments most philanthropists consider--I think most of these investments compound roughly in line with the global growth rate (which is smaller than market rates of return).
You might also have a high discount rate if you thought that the future was likely to have much worse philanthropic opportunities; but as far as I can tell a philanthropist today has just as many problems to solve as a philanthropist 20 years ago, and frankly I can see a lot of possible problems on the horizon for a philanthropist to invest in, so I don't find this compelling.
Sometimes “movement-building” is offered as an example of an activity with very high rates of returns. At the moment I am somewhat skeptical of these claims, and my suspicion is that it is more important for the “effective altruism” movement to have a fundamentally good product and to generally have our act together than for it to grow more rapidly, and I think one could also give a strong justification for prioritization research even if you were primarily interested in movement-building. But that is a much longer discussion.
**Pablo**: I think it would be interesting to examine more closely the object-level causes supported by EAs or proto-EAs in the past (over, say, the last decade), and use that examination to inform our estimates about the degree to which the value of future EA-supported causes will exceed that of causes that EAs support today.  Off the top of my head, the EAs I can think of who have donation records long enough to draw meaningful conclusions all have in the past supported causes that they would now regard as being significantly worse than those they currently favour.  So this would provide further evidence for one of the premises in your argument: that cause prioritization research can uncover interventions of high impact relative to our current best guesses.
The other premise in your argument, as I understand it, is that the value of the interventions we should expect cause prioritization research to uncover is high relative to the opportunity cost of current spending. Can you elaborate on the considerations that are, in your opinion, relevant for assessing this premise?
**Paul**: Sorry, this is going to be a bit of a long and technical response. I see three compelling reasons to prefer giving today to giving in the future. But altogether they don't seem to be a big deal compared to how much more we would expect to know in the future. Again, I think that using giving as an opportunity to learn stands out as an exception here--because in that case we can say with much more confidence that we will need to learn more at some point, and so the investment today is not much of a lost cause.
1. The actual good you do in the world compounds over time, so it is better to do good sooner than later.
2. There are problems today that won't exist in the future, so money in the future may be substantially less valuable than money today.
3. In the future there will be a larger pool of “smart money” that finds the best charitable opportunities, so there will be fewer opportunities to do good.
Regarding (1), I think that the vast majority of charitable activities people engage in simply do not compound that quickly. To illustrate, you might consider the case of a cash transfer to a poor family. Initially such a cash transfer earns a very high rate of return, but over time the positive impact diffuses over a broader and broader group of people. As it diffuses to a broader group, the returns approach the general rate of growth in the world, which is substantially smaller than the interest rate. Most other forms of good experience a very similar pattern. So if this were the only reason to give sooner, then I think that you would actually be better served by saving and earning prevailing interest rates for an extra year, and then donating a year later--even if you didn't expect to learn anything new.
A mistake I sometimes see people make is using the initial rates of return on an investment to judge its urgency. But those returns last for a brief period before spreading out into the broader world, so you should really think of the investment as giving you a fixed multiplier on your dollar before spreading out and having long-term returns that go like growth rates. It doesn't matter whether that multiplier accrues instantaneously or over a period of a few years during which you enjoy excess returns. In either case the magnitude of the multiplier is not relevant to the urgency of giving, just whether the multiplier is going up or down.
A category of good which is plausibly exceptional here is creating additional resources that will flexibly pursue good opportunities in the future. I'm aware that some folks around CEA assign very high rates of return, in excess of 30% / year, to investment in movement-building and outreach. I think this is an epistemic error, but that would probably be a longer discussion so it might be easier to restrict attention to object-level interventions vs. focusing on learning.
Regarding (2), I don't really see the evidence for this position. From my perspective the problems the world faces today seem more important--in particular, they have more significant long-term consequences--than the problems the world faced 200 years ago. It looks to me like this trend is likely to continue, and there is a good chance that further technological development will continue to introduce problems with an unprecedented potential impact on the future. So with respect to object-level work I'd prefer to address the problems of today rather than the problems of 200 years ago, and I think I'd probably be even happier addressing the problems we will face in 50 years.
Regarding (3), I do see this as a fairly compelling reason to move sooner rather than later. I think the question is one of magnitudes: how long do you expect it will take before the pool of “smart money” is twice as large? 10 times larger? I think it's very easy to overestimate the extent to which this group is growing. It is only at extremely exceptional points in history that this pool can grow 20% faster than economic growth. For example, if you are starting from a baseline of 0.1% of total philanthropic spending, that can only go up 20% per year for 40 years or so before you get to 100% of spending. On the flip side, I think it is pretty easy to look around at what is going on locally and mistakenly conclude that the world must be changing pretty rapidly.
I think most of what we are seeing isn't a changing balance between smart money and ordinary folk, it's continuously increasing sophistication on the part of donors collectively--this is a process that can go on for a very long time. In this case, it's not so clear why you would want to give earlier when you are one of many unsophisticated donors rather than giving later when you are one of many sophisticated donors, even if you were only learning as fast as everyone else in the world. The thing that drove the discount rate earlier was the belief that other donors were getting sophisticated faster than we are, so that our relative importance was shrinking. And that no longer seems to apply when you look at it as a community increasing in sophistication.
So overall, I see the reasons for urgency in giving to be relatively weak, and I think the question of whether to give or save would be ambiguous (setting aside psychological motivations, the desire to learn more, and social effects of giving now) even if we weren't learning more.
**Pablo**: Recently, a few EAs have questioned that charities vary in cost-effectiveness to the degree that is usually claimed within the EA community.  Brian Tomasik, for instance, argues that [charities differ by at most 10 to 100 times](http://utilitarian-essays.com/why-charities-do-not-differ-astronomically.html) (and much less so within a given field). Do you think that arguments of this sort could weaken the case for supporting research into cause prioritization, or change the type of cause prioritization research that EAs should support?
**Paul**: I think there are two conceptually distinct issues here which should be discussed separately, at least in this context.
One is the observation that a small group looking for good buys may not have as large an influence as it seems, if they will just end up slightly crowding out a much larger pool of thoughtful money. The bar for “thoughtful” isn't that high, it just needs to be sensitive to diminishing returns in the area that is funded. There are two big reasons why this is not so troubling:
- Money that is smart enough to be sensitive to diminishing marginal returns--and moreover which is sufficiently cause-agnostic to move between different fields on the basis of efficiency considerations--is also likely to be smart enough to respond to significant changes in the evidence and arguments for a particular intervention. So I think doing research publicly and contributing to a stock of public knowledge about different causes is not subject to such severe problems.
- The number of possible giving opportunities is really quite large compared to the number of charitable organizations. If you are looking for the best opportunities, in the long-term you are probably going to be contributing to the existence of new organizations working in areas which would not otherwise exist. This is especially true if we expect to use early investigations to help direct our focus in later investigations. This is very closely related to the last point.
This issue is most severe when we consider trying to pursue idiosyncratic interests, like an unusually large degree of concern for the far future. So this consideration does make me a bit less enthusiastic about that, which is something I've written about before. Nevertheless, I think in that space there are many possible opportunities which are simply not going to get any support from people who aren't thinking about the far future, so there still seems to be a lot of good to do by improving our understanding.
A second issue is that broad social improvements will tend to have a positive effect on society's ability to resolve many different problems. So if there is any exceptionally impactful thing for society to do, then that will also multiply the impact of many different interventions. I don't think this consideration says too much about the desirability of prioritization: quite large differences are very consistent with this observation, these differences can be substantially compounded by uncertainty about whether the indirect effects of an intervention are good and bad, and there is substantial variation even in very broad measures of the impact of different interventions. This consideration does suggest that you should pay more attention to very high-impact interventions even if the long-term significance of that impact is at first ambiguous.
**Pablo**: Finally, what do you think is the most effective way to promote cause prioritization research?  If an effective altruist reading this interview is persuaded by your arguments, what should this person do?
**Paul**: One conclusion is that it would be premature to settle on an area that currently looks particularly attractive and simply scale up the best-looking program in that area. For example, I would be hesitant to support an intervention in global health (or indeed in most areas) unless I thought that supporting that intervention was a cost-effective way to improve our understanding of global health more broadly. That could be because executing the intervention would provide useful information and understanding that could be publicly shared, or because supporting it would help strengthen the involvement of EAs in the space and so help EAs in particular improve their understanding. One could say the same thing about more speculative causes: investments that don't provide much feedback or help us understand the space better are probably not at the top of my priority list.
Relatedly, I think that global health receives a lot of attention because it is a particularly straightforward area to do good in; I think that's quite important if you want your dollar to do as much good directly as possible, but that it is much less important (and important in a different way) if you are paying appropriate attention to the value of learning and information.
Another takeaway is that it may be worth actively supporting this research, either by supporting organizations that do it or by giving on the basis of early research. I think Good Ventures and GiveWell Labs are currently the most credible effort in this space (largely by virtue of having done much more research in this space than any other comparably EA-aligned organization), and so providing support for them to scale up that research is probably the most straightforward way to directly support cause prioritization. There are some concerns about GiveWell Labs capturing only half of marginal funding, or about substituting with Good Ventures' funding; that would again be a longer and much more complicated discussion. My view would be that those issues are worth thinking about but probably not deal-breakers.
I hear that CEA may be looking to invest more in this area going forward, and so supporting CEA is also a possible approach. To date they have not spent much time in this area and so it is difficult to predict what the output will look like. To the extent that this kind of chicken-and-egg problem is a substantial impediment to trying new things faster and you have confidence in CEA as an organization, providing funding to help plausible-looking experiments get going might be quite cost-effective.
A final takeaway is that the balance between money and human capital looks quite different for philanthropic research than for many object-level interventions. If an EA is interested in scaling up proven interventions, it's very likely that their comparative advantage is elsewhere and they are better served by [earning money and distributing it to charities doing the work they are most excited about](http://www.effective-altruism.com/category/earning-to-give/). But if you think that increasing philanthropic capacity is very important, it becomes more plausible that the best use of time for a motivated EA is to work directly on related problems. That might mean working for an “EA” organization, working within the philanthropy sector more broadly, or pursuing a career somewhere else entirely. Once we are talking about applying human capital rather than money, ability and enthusiasm for a particular project become a very large consideration (though the kinds of considerations we've discussed in this interview can be another important input).]]></description></item><item><title>How can doctors do the most good? An interview with Dr Gregory Lewis</title><link>https://stafforini.com/notes/how-can-doctors-do-the-most-good-an-interview-with-dr-gregory-lewis/</link><pubDate>Fri, 28 Aug 2015 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/how-can-doctors-do-the-most-good-an-interview-with-dr-gregory-lewis/</guid><description>&lt;![CDATA[Gregory Lewis is a public health doctor training in the east of England. He studied medicine at Cambridge, where he volunteered for Giving What We Can and 80,000 Hours. He blogs at [The Polemical Medic](http://www.thepolemicalmedic.com/). This interview was conducted as part of the research I did for Will MacAskill's book, _[Doing Good Better: How Effective Altruism Can Help You Make a Difference](http://www.effectivealtruism.com/)_. Greg's inspiring story is discussed in chapters 4 and 5 of that book.
---
**Pablo Stafforini**: To get us started, can you tell us a bit about your background, and in particular about your reasons for deciding to become a doctor?
**Gregory Lewis**: Sure. I guess I found myself at the age of 14 or so being fairly good at science and not really having any idea of what to do with myself. I had some sort of vague idea of trying to want to make the world a better place, in some slightly naive way. So I sort of thought, “What am I going to do with myself?” And my thoughts were pretty much _verbatim_, “Well, I'm good at science and want to do good. Doctors are good at science and they want to do good. Therefore, I want to be a doctor.” So based on that simple argument, I applied to medical school, got in, spent the following six years of my life in medical school qualifying as a doctor, and here I am today.
**Pablo**: I suppose that at some point between the age of fourteen and the present you changed your mind to some degree about the kind and amount of good that doctors could do. Can you elaborate on that?
**Greg**: Yes. One of my interests outside medicine was philosophy -- I almost studied philosophy at university, but I thought I could do more good as a doctor. It was through this I read Peter Unger's book _[Living High and Letting Die](http://www.amazon.com/Living-High-Letting-Die-Innocence/dp/0195108590/ref=sr_1_1?ie=UTF8&qid=1439333672&sr=8-1&keywords=living+high+and+letting+die&tag=s4charity-20)_. This book opened my eyes to the importance and moral significance of giving substantially to charity, and I took this message to heart. But I didn't really link it up with what I'd planned in my career: I thought I would heal the sick (if you'll excuse the expression) in my day job, and the good I would do by giving a lot to charity would be an added bonus.
It took a couple of years, and coming into contact with people like Toby Ord and Will MacAskill, to begin to put these things together, and look again at my plans to be a doctor. How much good did doctors really do and (more importantly) how did it stack up in comparison with all the other things I could do instead? So I began to look at this question and found (somewhat to my disappointment) that working as a doctor doesn't fare well in this sort of comparison.
**Pablo**: You mention Unger's book, and I recall that in that book the argument for earning to give is [briefly sketched](http://www.jefftk.com/p/history-of-earning-to-give). Did you notice that argument when reading the book, or did you just focus on the message that you should be donating a big chunk of the money you'd expect to earn in your current career (as opposed to switching to an even more lucrative career)?
**Greg**: Yes, I remember reading those couple of paragraphs where he suggests that philosophers should consider moving out of academia into corporate law or more lucrative fields, so that they would have more money to give away. That sounded right to me back then, but I didn't really see medicine at the time as an ‘earning to give' career—I thought the direct impacts of medicine were substantial, so a medical career got the ‘best of both worlds', and the money I would give away would be an ‘added bonus' to the direct work as a medical doctor. It took getting more involved with the effective altruism community to think I should try and combine these two worlds, and that I should try and weigh up how much good doctors do versus how much good donations do, and plan my career accordingly.
**Pablo**: So when you started to think more systematically about the amount of good that doctors could do, do you think you encountered any internal resistance to the possible conclusion that this amount might not be as high as you had assumed initially?
**Greg**: The seeds of scepticism were sown fairly early in my training. Doctors themselves generally are fairly cynical of the good they do, and when they talk about ‘healing the sick', it is with tongue firmly in cheek. One conversation I remember clearly (and which, in retrospect, I wish I had paid more attention to) was talking to a doctor in paediatrics, who said something along the lines of, “I don't ever feel like I'm saving lives or making a big difference, because although I might be the guy giving the life-saving treatments, if I wasn't there they would have called the doctor just down the hall, who would have done exactly the same as I.”
I gradually internalized this more realistic view on how much good I could do as a doctor. This was somewhat disappointing to me, but I wasn't that fazed by it. Maybe the world is just set up such that it's really hard to make a big difference, and if the best I could hope for was to make a more modest contribution, that is still definitely ‘worth it', and (like many other doctors) I decided my prior zeal to heal the sick and save the world was quixotic, immature, and naive: “The mark of an immature man is the desire to die nobly for a cause, whilst the mark of the mature man is the desire to live humbly for one.” I flirted with the idea of working for Médecins Sans Frontières (MSF) or abroad, but I wasn't thinking systematically.
So one of the major upsides of reading _Living High and Letting Die_ was finding out my 17-year-old self wasn't so unrealistic in hoping to save hundreds or thousands of lives—things that good are within our reach. The downside was this would happen through a very different channel. Rather than 17-year-old me's vainglorious vision of (thousands of times over!) striding in, white coat billowing, and saving some stricken patient with my cleverness, it would be me posting a cheque or clicking a bank transfer: I'd know abstractly that this would do so much good, but I wouldn't be able to point to the person I helped. As it turns out, that's no big deal—especially compared to the sheer magnitude of good done.
**Pablo**: Your conclusion that you wouldn't in your capacity as a doctor be doing as much good as alternative paths to impact appears to involve both a premise about the amount of good doctors typically do and a premise about the amount of good that such people can do in other ways. That is, you seem to be claiming that doctors do less good directly than people assume, but also that people, including doctors, can do much more good than they think by donating to the right causes. Is that correct?
**Greg**: Yes. The major upshot of the work I've done into how much good a doctor does is that the average doctor probably saves around a handful of lives over their career. So that's bad news for medics. By contrast, giving fairly small amounts to charity can save hundreds of lives (or maybe more) over your working life, and that's good news for everyone!
**Pablo**: Let's zoom in on your work about the good doctors do. Insofar as it's possible to discuss these issues in an informal interview, without having all the relevant figures in front of us, can you sketch the argument for the conclusion that a doctor saves about 200 DALYs over the course of his or her career?
**Greg**: Sure. I started looking at the research literature expecting there would be a lot of work done on the ‘return' of having more doctors—I figured this would be important to running a health system, or something more introspective members of the profession would have wanted to find out. As it happened, there was basically no work looking at the question: “How much good does a doctor do?”
The closest is [work by an epidemiologist called John Bunker](http://ije.oxfordjournals.org/content/30/6/1260.full): he and colleagues were looking at the question of how much of the dramatic gains in health and life expectancy in the western world could be attributed to medical treatment. Their strategy was to look at the few hundred or so commonest medical interventions: fixing broken bones, treating heart attacks, stuff like that. For each of these, they looked at clinical trials to see how much good each of those things did, and, by adding them together, worked out how much good medicine as a whole does. You can extrapolate from their figures to how many healthy life years (a measure of length and quality of life---you can think of 30 healthy years as ‘one life saved') are added to the population by the medical profession, and then divide by the number of doctors to get the ‘years added per doctor'. This is about 2,250 health-years saved per medical career---that's pretty good, about 80 lives.
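The extrapolation Greg describes is simple division, and can be sketched as follows. The input figures (total healthy life years added per year, the doctor headcount, and a 40-year career) are illustrative round numbers chosen to land near the interview's totals, not Bunker's published data.

```python
# Back-of-envelope version of the 'bottom up' extrapolation described above.
# All inputs are illustrative round numbers, not Bunker's actual figures.

YEARS_PER_LIFE = 30  # convention used above: ~30 healthy years ~= one life saved

def health_years_per_career(total_years_per_year, n_doctors, career_years=40):
    """Healthy life years attributable to one doctor over a full career."""
    return total_years_per_year / n_doctors * career_years

# Assumed: medicine adds ~9.6M healthy life years annually across ~170,000 UK doctors.
per_career = health_years_per_career(9_600_000, 170_000)
lives_saved = per_career / YEARS_PER_LIFE
print(f"{per_career:.0f} health-years, ~{lives_saved:.0f} lives")
```

With these assumed inputs the sketch lands in the same ballpark as the figures Greg quotes: on the order of 2,250 health-years, or roughly 75–80 lives, per career.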
There are several reasons to suspect this is an overestimate. One of the big ones is that the difference a doctor makes should be assessed _on the margin_. Although the first few doctors should be able to make a massive difference, subsequent doctors (like being the 170,001st in the UK) should make a smaller difference, as all the easy ways to make people live longer and healthier should already be taken. If I were removed from my post, there wouldn't be a ‘Greg-shaped hole' in the hospital where all my patients go untreated. Rather, the remaining doctors would reallocate their tasks so that only the least important things don't get done.
So I began to attack this problem from the ‘top down' rather than from the ‘bottom up'. Instead of compiling an inventory of medical treatment, I looked at aggregate measures of health and physician density, and looked at the relationship between them: looking at all the countries in the globe, did having more doctors per capita correspond to lower burdens of death and disability? The answer was ‘yes', but there were diminishing returns. What I then did was fit a line of best fit to this curve, and work out, if you were in the UK and you added one more doctor to the population, how much further along the curve you go, and how much disability falls. This figure is smaller, of the order of 400-800 health-years averted per medical career: 20 to 30 lives.
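Greg's marginal calculation can be sketched like this. The functional form and coefficients below are invented for illustration (chosen so the answer lands inside his 400-800 range), since the interview doesn't give the fitted equation.

```python
# Illustrative 'top down' marginal calculation: given a fitted diminishing-
# returns curve relating physician density to disease burden, adding one
# doctor in a country like the UK averts only a small extra burden.
# The curve's form and coefficients are invented, not Greg's actual fit.

def burden_per_capita(density, a=0.25, k=0.3):
    """Hypothetical fitted curve: DALYs lost per person per year as a
    function of doctors per 1,000 people (diminishing returns)."""
    return a * density ** (-k)

def marginal_dalys_per_career(density, population, career_years=40):
    """DALYs averted over a full career by one additional doctor."""
    new_density = density + 1000 / population  # one extra doctor, in per-1,000 units
    averted_per_year = (burden_per_capita(density) - burden_per_capita(new_density)) * population
    return averted_per_year * career_years

# Assumed UK-like inputs: ~2.8 doctors per 1,000 people, population ~65M.
print(marginal_dalys_per_career(2.8, 65_000_000))  # of the order of several hundred DALYs
```

Because the curve flattens at high density, the marginal doctor averts far less than the ~2,250 health-years of the naive ‘bottom up' average.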
This figure, however, is also going to be an overestimate, because we implicitly ignore confounding factors—there are fairly obvious things that will increase both health and physician density, like wealth, sanitation, or education. Indeed, it's received wisdom that these ‘social determinants of health' are far more important than doctors. Happily, international data on these factors are also available, and one can try and tease apart these interrelationships by a technique called regression analysis. This gives a smaller figure still. The average doctor averts 200 or so DALYs per medical career—six lives or so.
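On synthetic data, the regression adjustment Greg describes looks like the sketch below. Here ‘wealth' stands in for the social determinants of health, and all numbers are fabricated purely to show the mechanism: a confounder that raises both health and physician density makes the naive coefficient on doctors look larger than it really is.

```python
# Toy demonstration of why controlling for confounders shrinks the estimate.
# Synthetic data: 'wealth' raises health AND physician density, so a naive
# regression of burden on density alone over-credits the doctors.
import numpy as np

rng = np.random.default_rng(0)
n = 200
wealth = rng.normal(size=n)                   # confounder (e.g. GDP per capita)
density = 0.8 * wealth + rng.normal(size=n)   # richer countries train more doctors
burden = -0.2 * density - 1.0 * wealth + rng.normal(scale=0.5, size=n)

# Naive fit: burden ~ density.
X_naive = np.column_stack([np.ones(n), density])
naive_coef = np.linalg.lstsq(X_naive, burden, rcond=None)[0][1]

# Adjusted fit: burden ~ density + wealth, as in the regression described above.
X_adj = np.column_stack([np.ones(n), density, wealth])
adjusted_coef = np.linalg.lstsq(X_adj, burden, rcond=None)[0][1]

print(f"naive: {naive_coef:.2f}, adjusted: {adjusted_coef:.2f}")
```

The adjusted coefficient recovers something close to the true per-doctor effect (−0.2 here), while the naive one absorbs most of wealth's effect as well.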
There are all sorts of caveats with this sort of work---the data is fair but not great, it is fundamentally an observational study, and there's always the spectre of unaccounted-for confounds. Despite these concerns, I'd be surprised if this figure was off by an order of magnitude or more. If anything, this already fairly low estimate is also over-optimistic: two big factors are that I'm ignoring _counterfactuals_ and _elasticity_ (if I never went to medical school, there wouldn't be ‘one fewer doctor'; it would be more like ‘I would be replaced by the marginal candidate who just missed out on med school'); even worse, physician density is serving as a proxy for ‘medical professional density', from nurses, to hospital cleaners, to laboratory scientists. It's implausible that doctors can take all of the credit, or even a majority of it. So even if doctors have the largest impact out of all the health professions, one is still looking at another adjustment down, by a factor of at least two.
**Pablo**: How much could altruistically motivated doctors boost that figure if they targeted their efforts more intelligently, e.g. by working in a less developed country or in a more lucrative specialty?
**Greg**: That was the question I asked myself next: given this is the average impact of a doctor, how could I try to do better than average? This is tricky, as the ‘top down' technique I used to find the average is too coarse-grained to answer these questions: there isn't the data, for example, to work out whether the marginal impact of cardiologists is greater than that of colorectal surgeons, or things like that.
One strategy could be to exploit the ‘diminishing returns' effect and go somewhere where the curve is steeper and so there are increasing benefits to having ‘an additional doctor'—this really crudely models ‘a career spent working in MSF' or with a similar NGO. This does give a bigger impact, by a factor of 10 or so.
However, the chequebook can likely beat the stethoscope, even one wielded by an MSF doctor. The average doctor in the UK will earn around 2.5 million pounds over their lifetime. Giving 10% of this to the right interventions will still ‘beat' an additional doctor abroad. And one can always give more than 10%, and although that is hard, it may not be as hard as spending one's career in the developing world.
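The chequebook-vs-stethoscope comparison is explicit arithmetic; a minimal sketch follows. The cost per DALY averted is an assumed round number for a highly effective charity, not a figure quoted in the interview.

```python
# The chequebook-vs-stethoscope comparison above, as explicit arithmetic.
# COST_PER_DALY_GBP is an assumed round number, not quoted in the interview.

LIFETIME_EARNINGS_GBP = 2_500_000   # average UK doctor, per the interview
GIVING_FRACTION = 0.10
COST_PER_DALY_GBP = 50              # assumption: a very effective charity

dalys_via_donations = LIFETIME_EARNINGS_GBP * GIVING_FRACTION / COST_PER_DALY_GBP
dalys_as_msf_doctor = 200 * 10      # ~10x the ~200-DALY average, per the interview

print(dalys_via_donations, dalys_as_msf_doctor)
```

Under these assumptions the 10% donation stream averts more DALYs than a whole career spent as the marginal doctor abroad, which is the comparison Greg is making.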
The next question—going back to Unger—is whether there are particularly lucrative medical careers one could target with the aim of giving more away. And there are, at least when working in the Western world. To give the UK as an example, average consultant earnings by specialty vary by a factor of 3 or so, and the main determinant of this variation is the capacity that specialty has for private practice: you can't really work privately as an emergency physician, but one can work wholly outside the NHS as a plastic surgeon. So medicine is a fairly good earning to give option, although it is worth noting that if earning to give ‘beats' direct impact by a large margin, it perhaps would be even better to attempt to work in even more lucrative careers outside of medicine.
Finally, there are ‘peri-medic' roles that could be really important but hard to quantify: the chief medical officer for the NHS (or for the WHO), or the researcher who makes the breakthrough for a malaria vaccine, could have massive impact, so much so that it might be worth attempting even if it is a very long shot and one is likely to achieve something far more modest. It's pretty hard to quantify these considerations, but they look like career paths that could be even better than earning to give.
Perhaps the upshot is that direct work as a doctor is relatively small-fry compared to what you could do instead. Which ‘instead', though, remains very difficult to work out.
**Pablo**: To wrap up, you mentioned that you are currently giving about 50% of your income to cost-effective charities. Can you elaborate on what motivated you to give away such a big portion of your income and whether you find that difficult on a personal level?
**Greg**: Sure. So, given what I'd read by Unger, and in philosophy more generally, giving a lot to charity seemed a bit of a moral no-brainer. On the one side, several lives in my first year of being employed (and several thousand over my career as my salary grows), and on the other side, not a huge amount. So I committed to give 10% of my earnings whilst I was a medical student.
It became even more of a no-brainer when I actually started working. I am privileged in all manner of ways, but not least in that I live without dependents in a modern liberal democracy with almost double the median income of my country, and so among the top few percent of the planet by wealth. I found that living similarly to (though still better than) how I lived as a medical student left me with almost half my paycheck. So I started giving 10%, and have steadily increased this month on month until now I'm giving about 50%.
It's only at this larger proportion that there's any real personal ‘sacrifice' on my part: I now plan journeys in advance, keep a monthly budget, and don't reflexively eat out whenever the opportunity presents itself. I also haven't (as some of my colleagues have) got a BMW on finance, or regularly holidayed across the world. I don't really miss these luxuries, especially as these sacrifices are made without choice by most people living in the UK (and the globe), including people who work much harder and longer than I do alongside me in hospital.
I'm still in the wealthiest 10% of people on the planet. More importantly, I still get to keep the things that really matter: family, friends, literature, music, a career that, even though it might not save the world, is immensely personally fulfilling and interesting. Even better, I am happy I am doing something significant in making the world go better. I think the 17 year old me who wanted to be a doctor would be happy, but surprised, at the doctor he turned into.
**Pablo**: Awesome. Thanks, Greg!
_Crossposted to [80,000 Hours](https://80000hours.org/2015/08/how-can-doctors-do-the-most-good-an-interview-with-dr-gregory-lewis/)_]]></description></item><item><title>Good Done Right</title><link>https://stafforini.com/notes/good-done-right/</link><pubDate>Sat, 01 Feb 2020 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/good-done-right/</guid><description>&lt;![CDATA[Good Done Right was a conference on effective altruism held at All Souls College, Oxford on 7-9 July, 2014. It was perhaps the very first conference of its kind, and it featured an impressive roster of speakers. Some of the talks explored topics, such as moral trade, that would later become more widely discussed. One of these presentations was so good that I decided to [transcribe it](/notes/crucial-considerations-and-wise-philanthropy-by-nick-bostrom/).
Recordings of all the talks were subsequently [made available](https://80000hours.org/2014/07/good-done-right-audio-recordings-now-online/) on a website dedicated to the conference. Unfortunately, the website has since gone offline, and the Internet Archive hasn't indexed it properly. I contacted Andreas Mogensen, the conference organizer, and he supplied me with an image of the original conference poster (displayed below), but noted that he was no longer in possession of any of the audio recordings. Andreas also clarified that the list of speakers in the poster doesn't quite match the list of people that actually spoke at the event: Thomas Pogge didn't speak, whereas Elizabeth Ashford and Michelle Hutchinson did.
After a bit of detective work, I managed to locate recordings of most of these presentations. At the time of writing, three of these are on a [Soundcloud channel](https://soundcloud.com/gooddoneright/tracks) devoted to the conference, and most of the others are preserved by [EA Talks](https://www.eatalks.org/1755269/episodes). All these talks are listed below, in alphabetical order. I also obtained a number of photos of the event taken by Toby Ord, who kindly gave permission to share them here. I haven't been able to find recordings of the talks by Larissa MacFarquhar and Derek Parfit (as [the 80,000 Hours announcement confirms](https://80000hours.org/2014/07/good-done-right-audio-recordings-now-online/), both MacFarquhar and Parfit did participate in the event). However, upon noticing this post, Matthew van der Merwe reached out to me and generously shared the pdf Parfit used as the basis for his presentation (which he obtained from Parfit himself). I include a link to this text file as a substitute for the missing audio file. If anyone else not listed in the conference programme spoke at the conference, besides Ashford and Hutchinson, I haven't been able to find traces of their presentations.
- Elizabeth Ashford, [The allowing of severe poverty as the discarding of persons' lives](https://www.eatalks.org/1755269/episodes/8656192-good-done-right-elizabeth-ashford-the-allowing-of-severe-poverty-as-the-discarding-of-persons-lives)
- Nick Beckstead, [How can a long-run perspective help with strategic cause selection?](https://www.eatalks.org/1755269/episodes/8656185-good-done-right-nick-beckstead-how-can-a-long-run-perspective-help-with-strategic-cause-selection)
- Nick Bostrom, [Crucial considerations and wise philanthropy](https://soundcloud.com/gooddoneright/nick-bostrom-crucial-considerations-and-wise-philanthropy)
- Owen Cotton-Barratt, [Prioritising under uncertainty](https://soundcloud.com/gooddoneright/owen-cotton-barratt-prioritising-under-uncertainty)
- Norman Daniels, [Cost-effectiveness analysis and prioritization](https://www.eatalks.org/1755269/episodes/8656187-good-done-right-norman-daniels-cost-effectiveness-analysis-and-prioritization)
- Rachel Glennerster, [Using cost-effectiveness vs cost-benefit analysis in decision-making](https://www.eatalks.org/1755269/episodes/8656191-good-done-right-rachel-glennerster-using-cost-effectiveness-vs-cost-benefit-analysis-in-decision-making)
- Michelle Hutchinson, ["Too Young to Die?" -- How valuable is it to extend lives of different ages?](https://www.eatalks.org/1755269/episodes/8656190-good-done-right-michelle-hutchinson-too-young-to-die-how-valuable-is-it-to-extend-lives-of-different-ages)
- Jeremy Lauer, [Priority setting in the health sector using cost-effectiveness -- The WHO-CHOICE approach](https://www.eatalks.org/1755269/episodes/8656188-good-done-right-jeremy-lauer-priority-setting-in-the-health-sector-using-cost-effectiveness-the-who-choice-approach)
- Will MacAskill, [Effective altruism: the very idea](https://www.eatalks.org/1755269/episodes/8656193-good-done-right-will-crouch-effective-altruism-the-very-idea)
- Toby Ord, [Moral trade](https://soundcloud.com/gooddoneright/toby-ord-moral-trade)
- Derek Parfit, [How we can avoid the Repugnant Conclusion](http://stafforini.com/docs/Parfit%20-%20How%20can%20we%20avoid%20the%20Repugnant%20Conclusion.pdf)
{{< figure src="/ox-hugo/conference-poster.png" alt="Good Done Right conference poster listing speakers and schedule" >}}
{{< figure src="/ox-hugo/derek-parfit-at-lectern-closeup.jpg" alt="Derek Parfit speaking at the lectern, All Souls College, Oxford" >}}
{{< figure src="/ox-hugo/derek-parfit-at-lectern-wide.jpg" alt="Derek Parfit at the lectern in the wood-paneled hall of All Souls College" >}}
{{< figure src="/ox-hugo/derek-parfit-by-stained-glass.jpg" alt="Derek Parfit preparing his presentation beside stained glass windows at All Souls College" >}}
{{< figure src="/ox-hugo/conference-audience.jpg" alt="Audience at the Good Done Right conference in the vaulted hall of All Souls College" >}}
{{< figure src="/ox-hugo/conference-dinner.jpg" alt="Conference dinner in the great hall of All Souls College, Oxford" >}}]]></description></item><item><title>C. D. Broad: a bibliography</title><link>https://stafforini.com/notes/c-d-broad-a-bibliography/</link><pubDate>Tue, 28 May 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/c-d-broad-a-bibliography/</guid><description>&lt;![CDATA[{{< figure src="/ox-hugo/c-d-broad-portrait.jpg" alt="C. D. Broad" >}}<div class="verse">
Socius, lector, thesaurarii iunioris inter belli<br/>
angustias uice functus, moralis philosophiae in<br/>
academia professor disciplinae illius alias quoque<br/>
partes singulari acumine et diligentia lucidissime<br/>
tractauit. Non minus se ipsum quam alios nouerat.<br/>
In sermone plus salis quam fellis habuit. Sueciae<br/>
amorem prae se tulit. Huic collegio studuit opera<br/>
consiliis testamento sustinendo. Vita decessit<br/>
A.S.mcmlxxi suae aetatis lxxxiv<br/></div>
This bibliography is based on (1) C. Lewy's 'Writings of C. D. Broad, to the end of July 1959', in Paul A. Schilpp (ed.) _The philosophy of C. D. Broad_, New York: Tudor Publishing Company, 1959, pp. 833–852; (2) Andrew Chrucky's [Works by C. D. Broad](https://www.ditext.com/broad/bybroad.html); and (3) my own research online and at various British libraries. It is my best attempt to make Broad's writings freely available on the web. Corrections and additions are welcome.
---
1906 {#1906}
- {{< cite "Broad1906PhilosophyOmarKhayyam" >}}
1912 {#1912}
- {{< cite "Broad1912ReviewProceedingsAristotelian" >}}
- {{< cite "Broad1912ReviewSorleyMoral" >}}
- {{< cite "Broad1912ReviewBoodinTruth" >}}
- {{< cite "Broad1912ReviewWelbySignifics" >}}
- {{< cite "Broad1912ReviewJamesWard" >}}
1913 {#1913}
- {{< cite "Broad1913CriticalNoticeMeinong" >}}
- {{< cite "Broad1913NoteAchillesTortoise" >}}
- {{< cite "Broad1913ReviewRogersShort" >}}
- {{< cite "Broad1913LordHughCecil" >}}
- {{< cite "Broad1913CriticalNoticeCournot" >}}
- {{< cite "Broad1913ReviewProceedingsAristotelian" >}}
- {{< cite "Broad1913ReviewWalterMarvin" >}}
- {{< cite "Broad1913ReviewWildonCarr" >}}
1914 {#1914}
- {{< cite "Broad1914PerceptionPhysicsReality" >}}
- {{< cite "Broad1914CriticalNoticeAloys" >}}
- {{< cite "Broad1914MattersMainlyMental" >}}
- {{< cite "Broad1914HintsSocialAspirants" >}}
- {{< cite "Broad1914HeartEmperorTrue" >}}
- {{< cite "Broad1914DoctrineConsequencesEthics" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 17–42" >}}
- {{< cite "Broad1914CriticalNoticeCh" >}}
- {{< cite "Broad1914ReviewEncyclopaediaPhilosophical" >}}
- {{< cite "Broad1914ReviewProceedingsAristotelian" >}}
- {{< cite "Broad1914ReviewBottinelliCournot" >}}
- {{< cite "Broad1914ReviewSteinmannUber" >}}
- {{< cite "Broad1914MrBradleyTruth" >}}
- {{< cite "Broad1914ReviewAlfredRobb" >}}
- {{< cite "Broad1914ReviewLeonardNelson" >}}
- {{< cite "Broad1914ReviewEncyclopaediaPhilosophical" >}}
- {{< cite "Broad1914ReviewLeonidGabrilovitsch" >}}
- {{< cite "Broad1914ReviewCouturatAlgebra" >}}
- {{< cite "Broad1914ReviewGustavHeim" >}}
1915 {#1915}
- {{< cite "Broad1915CriticalNoticeFederigo" >}}
- {{< cite "Broad1915CriticalNoticeAliotta" >}}
- {{< cite "Broad1915CriticalNoticeBertrand" >}}
- {{< cite "Broad1915ReviewWildonCarr" >}}
- {{< cite "Broad1915ExtractsRefoundMycenean" >}}
- {{< cite "Broad1915Phenomenalism" >}}
- {{< cite "Broad1915CriticalNoticeBertrand" >}}
- {{< cite "Broad1915ReviewLindemannTrs" >}}
- {{< cite "Broad1915WhatWeMean" >}}
- {{< cite "Broad1915CriticalNoticeRobb" >}}
- {{< cite "Broad1915ReviewProceedingsAristotelian" >}}
1916 {#1916}
- {{< cite "Broad1916PreventionWar" >}}
- {{< cite "Broad1916ReviewMoreLimitations" >}}
- {{< cite "Broad1916ReviewMachScience" >}}
- {{< cite "Broad1916ReviewGeorgCantor" >}}
- {{< cite "Broad1916FunctionFalseHypotheses" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 43–62" >}}
- {{< cite "Broad1916ReviewProceedingsAristotelian" >}}
- {{< cite "Broad1916NoteConnotationDenotation" >}}
- {{< cite "Broad1916ReviewJohnstonIntroduction" >}}
- {{< cite "Broad1916NatureGeometrySpace" >}}
1917 {#1917}
- {{< cite "Broad1917HumeTheoryCredibility" >}}
- {{< cite "Broad1917CriticalNoticeGeorge" >}}
- {{< cite "Broad1917ReviewDavidEugene" >}}
- {{< cite "Broad1917ReviewRichardsonLandis" >}}
- {{< cite "Broad1917LordKilsbyImpossible" >}}
1918 {#1918}
- {{< cite "Broad1918BodyMind" >}}
- {{< cite "Broad1918CriticalNoticeProceedings" >}}
- {{< cite "Broad1918DukedomHampshire" >}}
- {{< cite "Broad1918GeneralNotationLogic" >}}
- {{< cite "Broad1918CriticalNoticeProceedings" >}}
- {{< cite "Broad1918DegradationEnergie" >}}
- {{< cite "Broad1918RelationInductionProbability" >}}
- {{< cite "Broad1918WhatSenseSurvival" >}}
- {{< cite "Broad1918CriticalNoticeBertrand" >}}
- {{< cite "Broad1918Note" >}}
1919 {#1919}
- {{< cite "Broad1919MechanicalExplanationIts" >}}
- {{< cite "DawesHicks1919SymposiumThereKnowledge" >}}
- {{< cite "Broad1919AntecedentProbabilitySurvival" >}}
- {{< cite "Broad1919CriticalNoticeErnest" >}}
- {{< cite "Broad1919ReviewBechhoferReckitt" >}}
- {{< cite "Broad1919ReviewJourdainPhilosophy" >}}
- {{< cite "Broad1919NotionGeneralWill" >}}
- {{< cite "Broad1919Reality" >}}
1920 {#1920}
- {{< cite "Broad1920RelationInductionProbability" >}}
- {{< cite "Broad1920ReviewWhiteheadInquiry" >}}
- {{< cite "Broad1920CriticalNoticeLossky" >}}
- {{< cite "Broad1920RomanceNewJerusalem" >}}
- {{< cite "Broad1920EuclidNewtonEinstein" >}}
- {{< cite "Broad1920CriticalNoticeWhitehead" >}}
- {{< cite "Broad1920ReviewAristotelianSociety" >}}
- {{< cite "Broad1920EuclidNewtonEinsteina" >}}
- {{< cite "Broad1920CriticalNoticeBernard" >}}
- {{< cite "Broad1920PhilosophicalAspectTheory" >}}
1921 {#1921}
- {{< cite "Broad1921ProfAlexanderGifford" >}}
- {{< cite "Broad1921ReviewErwinFreundlich" >}}
- {{< cite "Broad1921ReviewWhiteheadConcept" >}}
- {{< cite "Broad1921CharacterCognitiveActs" >}}
- {{< cite "Broad1921ProfAlexanderGifforda" >}}
- {{< cite "Broad1921ReviewBroseTr" >}}
- {{< cite "Broad1921ReviewTaggartNature" >}}
- {{< cite "Broad1921ReviewClerkMaxwell" >}}
- {{< cite "Broad1921ReviewFlorianCajori" >}}
- {{< cite "Broad1921ExternalWorld" >}}
- {{< cite "Broad1921ReviewTaggartNature" >}}
- {{< cite "Broad1921ReviewCunninghamRelativity" >}}
- {{< cite "Broad1921ReviewRobbAbsolute" >}}
- {{< cite "Broad1921Time" >}}
1922 {#1922}
- {{< cite "Broad1922CriticalNoticeKeynes" >}}
- {{< cite "Broad1922ReplyBosanquetProf" >}}
- {{< cite "Broad1922NeglectedMethodPsychical" >}}
- {{< cite "Broad1922CriticalNoticeJohnson" >}}
1923 {#1923}
- {{< cite "Broad1923ScientificThought" >}}
- {{< cite "Broad1923Correction" >}}
- {{< cite "Broad1923VariousMeaningsTerm" >}}
- {{< cite "Broad1923CriticalNoticeWhitehead" >}}
- {{< cite "Broad1923ButlerTheologian" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 202–219" >}}
- {{< cite "Broad1923ButlerMoralist" >}}
- {{< cite "Broad1923ReviewBoscovichTheoria" >}}
1924 {#1924}
- {{< cite "Broad1924SymposiumCriticalRealism" >}}
- {{< cite "Broad1924MrJohnsonLogical" >}}
- {{< cite "Broad1924MrJohnsonLogical2" >}}
- {{< cite "Broad1924CriticalSpeculativePhilosophy" >}}
1925 {#1925}
- {{< cite "Broad1925MindItsPlace" >}}
- {{< cite "Broad1925LateDrMcTaggart" >}}
- {{< cite "Broad1925LateDrMcTaggarta" >}}
- {{< cite "Broad1925GuideTrinityCollege" >}}
- {{< cite "Broad1925ReviewEmileMeyerson" >}}
1926 {#1926}
- {{< cite "Broad1926PhilosophyFrancisBacon" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 117–143" >}}
- {{< cite "Broad1926SymposiumValidityBelief" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 159–174" >}}
- {{< cite "Broad1926NecromanticTripos" >}}. Reprinted in {{< cite "Walmsley2022CDBroad" "pp. 360–364" >}}
1927 {#1927}
- {{< cite "Broad1927EditorPreface" >}}
- {{< cite "Broad1927SirIsaacNewton" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 3–28" >}}
- {{< cite "Broad1927JohnMcTaggartEllis" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 70–93" >}}
- {{< cite "Broad1928PrinciplesProblematicInduction" >}}
- {{< cite "Broad1927InterviewsFamousMen" >}}
1928 {#1928}
- {{< cite= "Broad1928ReviewBertrandRussell"=>}}
- {{< cite= "Broad1928SymposiumTimeChange"=>}}
- {{< cite= "Broad1928AnalysisEthicalConcepts"=>}}. Reprinted in {{< cite= "Broad1971BroadsCriticalEssays"= "pp.= 63–81"=>}}
1929 {#1929}
- {{< cite= "Broad1929CriticalNoticeTennant"=>}}
- {{< cite= "Broad1929ReviewJohnstonStruthers"=>}}
1930 {#1930}
- {{< cite= "Broad1930FiveTypesEthical"=>}}
- {{< cite= "Broad1930DogmasReligion"=>}}
- {{< cite= "Broad1930PrinciplesDemonstrativeInduction"=>}}
- {{< cite= "Broad1930CriticalNoticeEwing"=>}}
- {{< cite "Broad1930CriticalNoticeTennant" >}}
1931 {#1931}
- {{< cite "Broad1931WarThoughtsPeacetime" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 247–281" >}}
- {{< cite "Broad1931ReviewStoutStudies" >}}
- {{< cite "Broad1931ReviewWrightMiracle" >}}
- {{< cite "Broad1931IndeterminacyIndeterminism" >}}
- {{< cite "Broad1931ReviewTaylorFaith" >}}
- {{< cite "Broad1931WilliamErnestJohnson" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 94–114" >}}
- {{< cite "Broad1932McTaggartPrincipleDissimilarity" >}}
1932 {#1932}
- {{< cite "Broad1931CriticalNoticeStout" >}}
- {{< cite "Broad1932ReviewLowesDickinson" >}}
1933 {#1933}
- {{< cite "Broad1933ExaminationMcTaggartPhilosophy" >}}
- {{< cite "Broad1933JohnLocke" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 29–48" >}}
- {{< cite "Broad1933ProfHallettAeternitas" >}}
- {{< cite "Broad1933ReviewBrailsfordRobertson" >}}
1934 {#1934}
- {{< cite "Broad1934DeterminismIndeterminismLibertarianism" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 195–217" >}} and in {{< cite "Broad1971BroadsCriticalEssays" "pp. 82–105" >}}
- {{< cite "Broad1934GoodnessNameSimple" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 106–123" >}}
1935 {#1935}
- {{< cite "Broad1935CritcalNoticeKeeling" >}}
- {{< cite "Broad1935MrDunneTheory" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 68–85" >}}
- {{< cite "Broad1935MechanicalTeleologicalCausation" >}}
- {{< cite "Broad1935NormalCognitionClairvoyance" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 27–67" >}}
- {{< cite "Broad1935ReviewMcTEllis" >}}
1936 {#1936}
- {{< cite "Broad1936OughtWeFight" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 232–243" >}} and in {{< cite "Broad1971BroadsCriticalEssays" "pp. 124–135" >}}
- {{< cite "Broad1936AreThereSynthetic" >}}
- {{< cite "Broad1937OstensiblyPrecognitiveDream" >}}
- {{< cite "Broad1937LetterHonEditor" >}}
1937 {#1937}
- {{< cite "Broad1937PhilosophicalImplicationsForeknowledge" >}}
- {{< cite "Broad1937CriticalNoticeMises" >}}
- {{< cite "Broad1937McTaggartJohnMcTaggart" >}}
1938 {#1938}
- {{< cite "Broad1938ExaminationMcTaggartPhilosophy" >}}
- {{< cite "Broad1938ReviewStebbingPhilosophy" >}}
- {{< cite "Broad1938HenrySidgwick" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 49–69" >}}
- {{< cite "Broad1938SciencePsychicalPhenomena" >}}
- {{< cite "Broad1938HenrySidgwickPsychical" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 86–115" >}}
- {{< cite "Broad1938SerialismImmortality" >}}
1939 {#1939}
- {{< cite "Broad1939ArgumentsExistenceGod" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 175–189" >}}
- {{< cite "Broad1939ArgumentsExistenceGoda" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 189–201" >}}
- {{< cite "Broad1939PresentRelationsScience" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 220–243" >}}
1940 {#1940}
- {{< cite "Broad1940JohnAlbertChadwick" >}}
- {{< cite "Broad1940ConscienceConscientiousAction" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 244–262" >}} and in {{< cite "Broad1971BroadsCriticalEssays" "pp. 136–155" >}}
- {{< cite "Broad1940CriticalNoticeDavid" >}}
- {{< cite "Broad1940IntroductionWhatelyCarington" >}}
- {{< cite "Broad1940PhysicalAnalogy" >}}
- {{< cite "Broad1940ReviewSirArthur" >}}
1941 {#1941}
- {{< cite "Broad1941ReviewSamuelAlexander" >}}
- {{< cite "Broad1941ReviewJohnLaird" >}}
- {{< cite "Broad1940ReviewHardyMathematician" >}}
1942 {#1942}
- {{< cite "Broad1942KantTheoryMathematical" >}}
- {{< cite "Broad1942BerkeleyArgumentMaterial" >}}
- {{< cite "Broad1942RelationsScienceEthics" >}}
- {{< cite "Broad1942CertainFeaturesMoore" >}}
- {{< cite "Broad1942ReviewPaulArthur" >}}
1943 {#1943}
- {{< cite "Broad1943MrSaltmarsh" >}}
1944 {#1944}
- {{< cite "Broad1944HrWrightLogic" >}}
- {{< cite "Broad1944HrWrightLogica" >}}
- {{< cite "Broad1944HrWrightLogicb" >}}
- {{< cite "Broad1944CriticalNoticeJulian" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 156–187" >}}
- {{< cite "Broad1944ExperimentalEstablishmentTelepathic" >}}
- {{< cite "Broad1944StebbingMemorialFund" >}}
- {{< cite "Broad1944CaseApparentlyPrecognitive" >}}
- {{< cite "Broad1944NewPhilosophyBruno" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 144–167" >}}
1945 {#1945}
- {{< cite "Broad1945ReflectionsMoralsenseTheories" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 188–222" >}}
- {{< cite "Broad1945ProfessorStout" >}}
1946 {#1946}
- {{< cite "Broad1946SpinozaDoctrineHuman" >}}
- {{< cite "Broad1942LeibnizsLastControversy" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 168–191" >}}
- {{< cite "Broad1946ReviewTaylorDoes" >}}
- {{< cite "Broad1946DiscussionProfRhine" >}}
- {{< cite "Broad1946MainProblemsEthics" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 223–246" >}}
1947 {#1947}
- {{< cite "Broad1947ProfessorMarcWogauTheorie" >}}
- {{< cite "Broad1947PhilosophicalImplicationsPrecognition" >}}
- {{< cite "Broad1947MethodsSpeculativePhilosophy" >}}
- {{< cite "Broad1947CriticalNoticePaul" >}}
- {{< cite "Broad1947ReviewBertrandRussell" >}}
1948 {#1948}
- {{< cite "Broad1948ProgramNextTen" >}}
- {{< cite "Broad1947AlfredNorthWhitehead" >}}
- {{< cite "Broad1948IanGallie" >}}
- {{< cite "Broad1948ReviewToksvigEmanuel" >}}
1949 {#1949}
- {{< cite "Broad1949ReviewBrownMetaphysical" >}}
- {{< cite "Broad1949LeibnizPredicateinNotionPrinciple" >}}
- {{< cite "Broad1949Telepathy" >}}
- {{< cite "Broad1949RelevancePsychicalResearch" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 7–26" >}}
- {{< cite "Broad1949DrKeynes" >}}
1950 {#1950}
- {{< cite "Broad1950DrSoalForskning" >}}
- {{< cite "Broad1950EgoismTheoryHuman" >}}. Reprinted in {{< cite "Broad1952EthicsHistoryPhilosophy" "pp. 218–231" >}} and in {{< cite "Broad1971BroadsCriticalEssays" "pp. 247–261" >}}
- {{< cite "Broad1950CriticalNoticeWm" >}}
- {{< cite "Broad1950ReviewPatonTr" >}}
- {{< cite "Broad1950CommonFallaciesPolitical" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 282–297" >}}
- {{< cite "Broad1950SomeTrinityPhilosophers" >}}
- {{< cite "Broad1950DrKeynes18521949" >}}
- {{< cite "Broad1950ImmanuelKantPsychical" >}}. Reprinted in {{< cite "Broad1953ReligionPhilosophyPsychical" "pp. 116–155" >}}
- {{< cite "Broad1950ReviewArthurPrior" >}}
- {{< cite "Broad1950ReviewWhatelyCarington" >}}
- {{< cite "Broad1950ImperativesCategoricalHypothetical" >}}
1951 {#1951}
- {{< cite "Broad1951HagerstromAccountSense" >}}
- {{< cite "Broad1951LogisticAnalysisTwofold" >}}
- {{< cite "Broad1951LockeDoctrineSubstantial" >}}
1952 {#1952}
- {{< cite "Broad1952EthicsHistoryPhilosophy" >}}
- {{< cite "Broad1952ElementaryReflexionsSenseperception" >}}
- {{< cite "Broad1952CriticalNoticeToulmin" >}}
- {{< cite "Broad1952ReviewMoncrieffClairvoyanta" >}}
1953 {#1953}
- {{< cite "Broad1953ReligionPhilosophyPsychical" >}}
- {{< cite "Hagerstrom1953InquiriesNatureLaw" >}}
- {{< cite "Broad1953TranslatorPreface" >}}
- {{< cite "Broad1953ReviewBjorkhemDet" >}}
- {{< cite "Broad1953PhantasmsLivingDead" >}}
- {{< cite "Broad1953BerkeleyTheoryMorals" >}}
1954 {#1954}
- {{< cite "Broad1954LetterEditor" >}}
- {{< cite "Broad1954BerkeleyDenialMaterial" >}}
- {{< cite "Broad1954CriticalNotePrice" >}}
- {{< cite "Broad1954SynopsesPapers" >}}
- {{< cite "Broad1955KantMathematicalAntinomies" >}}
- {{< cite "Broad1954EmotionSentiment" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 283–301" >}}
1955 {#1955}
- {{< cite "Broad1955HumanPersonalityPossibility" >}}
- {{< cite "Broad1955PhenomenologyMrsLeonard" >}}
1956 {#1956}
- {{< cite "Broad1956EndBorleyRectory" >}}
- {{< cite "Broad1956ReviewLuceSense" >}}
- {{< cite "Broad1956HalfcenturyPsychicalResearch" >}}
1957 {#1957}
- {{< cite "Broad1957CorrespondenceHeavenHell" >}}
- {{< cite "Broad1957LocalHistoricalBackground" >}}
- {{< cite "Broad1956EraitaTuomasAkvinolaisen" >}}
- {{< cite "Broad1957DrTennant" >}}
1958 {#1958}
- {{< cite "Broad1958PersonalIdentitySurvival" >}}
- {{< cite "Broad1958Philosophy" >}}
- {{< cite "Broad1958ReviewMauriceCranston" >}}
- {{< cite "Broad1958GEMoore" >}}
- {{< cite "Broad1958FredericRobertTennant" >}}
- {{< cite "Broad1958HomosexualActs" >}}
1959 {#1959}
- {{< cite "Broad1959DreamingSomeImplications" >}}
- {{< cite "Broad1959ReviewNormanMalcolm" >}}
- {{< cite "Broad1959Autobiography" >}}
- {{< cite "Broad1959ReplyMyCritics" >}}. Partially reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 302–323" >}}
- {{< cite "Broad1959BaconExperimentalMethod" >}}
1961 {#1961}
- {{< cite "Broad1961MooreLatestPublished" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 324–350" >}}
- {{< cite "Broad1961HumesDoctrineOf" >}}
- {{< cite "Broad1961PhysicalityPsi" >}}
1962 {#1962}
- {{< cite "Broad1962LecturesPsychicalResearch" >}}
- {{< cite "Broad1962WittgensteinViennaCircle" >}}
- {{< cite "Broad1962ProblemPrecognition" >}}
1964 {#1964}
- {{< cite "Broad1964ObligationsUltimateDerived" >}}. Revised version of {{< cite "Broad1950ImperativesCategoricalHypothetical" >}}. Reprinted in {{< cite "Broad1971BroadsCriticalEssays" "pp. 351–368" >}}
- {{< cite "Broad1964MemoirAxelHagerstrom" >}}
1967 {#1967}
- {{< cite "Broad1967PersonalImpressionsRussell" >}}
- {{< cite "Broad1967RemarksSenseperception" >}}
- {{< cite "Broad1967NotionPrecognition" >}}. Reprinted in {{< cite "Broad1968NotionPrecognitiona" >}}
1968 {#1968}
- {{< cite "Broad1968InductionProbabilityAnd" >}}
- {{< cite "Broad1968ReviewAndersWedberg" >}}
- {{< cite "Broad1968BertrandRussellFirst" >}}
1970 {#1970}
- {{< cite "Broad1970Foreword" >}}
1971 {#1971}
- {{< cite "Broad1971BroadsCriticalEssays" >}}
- {{< cite "Broad1971Preface" >}}
- {{< cite "Broad1971SelfOthers" >}}
1973 {#1973}
- {{< cite "Broad1973BertrandRussellPhilosopher" >}}
1975 {#1975}
- {{< cite "Broad1975LeibnizIntroduction" >}}
1978 {#1978}
- {{< cite "Broad1978KantIntroduction" >}}
1985 {#1985}
- {{< cite "Broad1985Ethics" >}}
2022 {#2022}
- {{< cite "Walmsley2022CDBroad" >}}
_With thanks to Gwern, Kenneth Blackwell, and Leonardo Picón._]]></description></item><item><title>'Crucial Considerations and Wise Philanthropy', by Nick Bostrom</title><link>https://stafforini.com/notes/crucial-considerations-and-wise-philanthropy-by-nick-bostrom/</link><pubDate>Fri, 17 Mar 2017 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/crucial-considerations-and-wise-philanthropy-by-nick-bostrom/</guid><description>&lt;![CDATA[On July 9th, 2014, Nick Bostrom gave a talk on 'Crucial Considerations and Wise Philanthropy' ([audio](https://soundcloud.com/gooddoneright/nick-bostrom-crucial-considerations-and-wise-philanthropy)|[slides](https://nickbostrom.com/lectures/crucial.pptx)) at [Good Done Right](/notes/good-done-right/), a conference on effective altruism held at All Souls College, Oxford. I found the talk so valuable that I decided to transcribe it.
---
This talk will build on some of the ideas that Nick Beckstead was [talking about](https://podcastaddict.com/episode/https%3A%2F%2Fwww.buzzsprout.com%2F1755269%2Fepisodes%2F8656185-good-done-right-nick-beckstead-how-can-a-long-run-perspective-help-with-strategic-cause-selection.mp3&podcastRSS=https%3A%2F%2Ffeeds.buzzsprout.com%2F1755269.rss&guid=http%3A%2F%2Fearad.io%2F%3Fp%3D84) before lunch. By contrast with his presentation, though, this will not be a well-presented presentation. This is very much a work in progress, so there are going to be some jump cuts, and some of the bits will be muddled, etc. But I look forward to the discussion part of this.
What is a crucial consideration? {#what-is-a-crucial-consideration}
So I want to talk about this concept of a _crucial consideration_, which comes up a lot in the kind of work that we're doing. Suppose you're out in the forest, and you have a map and a compass, and you're trying to find some destination. You're carrying some weight, maybe a lot of water, because you need to stay hydrated to reach your goal, and you're trying to fine-tune the exact direction you're going. Maybe you're trying to figure out how much water you can pour out, to lighten your load without having too little to last to the destination.
All of these are normal considerations: you're fine-tuning the way you're going to make more rapid progress towards your goal. But then you look more closely at this compass that you have been using, and you realize that the magnet part has actually come loose. This means that the needle might now be pointing in a completely different direction that bears no relation to North: it might have rotated some unknown number of laps or parts of a lap.
With this discovery, you now completely lose confidence in all the earlier reasoning that was based on trying to get the more accurate reading of where the needle was pointing. This would be an example of a crucial consideration in the context of orienteering. The idea is that there could be similar types of consideration in more important contexts, that throw us off completely what we thought we knew about the overall direction or priority.
So here are two earlier attempts to describe this idea that I had. So a crucial consideration is:
- a consideration such that if it were taken into account it would overturn the conclusions we would otherwise reach about how we should direct our efforts, or
- an idea or argument that might possibly reveal the need not just for some minor course adjustment in our practical endeavors but a major change of direction or priority.
Within a utilitarian context, one can perhaps try to explicate it as follows:
- a crucial consideration is a consideration that radically changes the expected value of pursuing some high-level subgoal.
The idea here is that you have some evaluation standard that is fixed, and you form some overall plan to achieve some high-level subgoal. This is your idea of how to maximize this evaluation standard. A crucial consideration, then, would be a consideration that radically changes the expected value of achieving this subgoal, and we will see some examples of this. Now if you widen the context not limited to some utilitarian context, then you might want to retreat to these earlier more informal formulations, because one of the things that could be questioned is utilitarianism itself. But for most of this talk we will be kind of thinking about that component.
There are some related concepts that are useful to have. So a _crucial consideration component_ will be an argument, idea or some datum which, while not on its own amounting to a crucial consideration, seems to have a substantial probability of maybe being able to serve a central role within a crucial consideration. It's the kind of thing [of which we would say:] “This looks really intriguing, this could be important; I'm not really sure what to make of it at the moment.” On its own maybe it doesn't tell us anything, but maybe there's another piece that, when combined, will suddenly yield an important result. So those kinds of crucial consideration components could be useful to discover.
Then there's the concept of a _deliberation ladder_, which would be a sequence of crucial considerations regarding the same high-level subgoal, where successive considerations jostle you in opposing directions. Let's look at some examples of these kinds of deliberation ladders that help to illustrate the general predicament.
Should I vote in the national election? {#should-i-vote-in-the-national-election}
Let's take this question: (A1) “Should I vote in the national election?” At the sort of “level one” of reasoning, you think, “Yes, I should vote to put a better candidate into office.” That clearly makes sense.
Then you reflect some more: (A2) “But, my vote is extremely unlikely to make a difference. I should not vote but put my time to better use.”
(These examples are meant to illustrate the general idea; it's not so much I want a big discussion as to these particular examples, they're kind of complicated. But I think they will serve to illustrate the general phenomenon.)
So, with consideration number two we have gone from “Yes, we should vote. Now that involves making a plan to get to the polling booth,” etc. to “No, I should not vote. I should do something completely different.”
Then you think, (A3) “Well, although it's unlikely that my vote will make a difference, the stakes are very high: millions of lives are affected by the president. So even if the chance that my vote will be decisive is one in several million, the expected benefit is still large enough to be worth a trip to the polling station.” So, all right, I was going to kick back in front of the television and turn on the football game, and now, “Oh, well, actually I should vote”, so we've again reversed direction.
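The expected-value reasoning in (A3) can be made concrete with a back-of-the-envelope calculation. The probability, stakes, and cost figures below are illustrative assumptions, not numbers from the talk:

```python
# Hypothetical expected-value calculation for voting (all figures are assumed).
p_decisive = 1 / 5_000_000    # chance one vote decides the election (assumed)
lives_affected = 3_000_000    # people substantially affected by the outcome (assumed)
value_per_life = 1.0          # benefit units per person from the better candidate (assumed)
cost_of_voting = 0.5          # cost of the trip to the polling station, same units (assumed)

# Expected benefit of voting = probability of being decisive × total stakes.
expected_benefit = p_decisive * lives_affected * value_per_life

print(expected_benefit > cost_of_voting)  # True: worth voting on these assumptions
```

The point is structural rather than numerical: a tiny probability multiplied by very large stakes can still exceed the cost of acting, which is what flips the conclusion back at step (A3).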
Then you continue to think, (A4) “Well, if the election is not close, then my vote will make no difference. If the election _is_ close, then approximately half of the votes will be for the wrong candidate, implying either that the candidates are of almost exactly the same merit, so it doesn't really matter who wins, or that typical voters' judgment of the candidates' merits is _extremely_ unreliable and carries almost no signal. So I should not bother to vote.”
Now you sink back into the comfy sofa and bring out the popcorn or whatever, and then you think, (A5) “Oh, well, of course I'm a much better judge of the candidates' merits than the typical voter, so I should vote.”
Okay, on with the coat again. Then you think, (A6) “Well, but psychological studies show that people tend to be overconfident: almost everybody believes themselves to be above average, but they are as likely to be wrong as right about that. So if I am as likely to vote for the wrong candidate as is the typical voter, then my vote would add negligible information to the selection process, and I should not vote.”
Then we go on: (A7) “Okay, I've gone through all of this reasoning that really means that I'm special, so I should vote.”
But then, (A8) “Well, if I'm so special, then the opportunity cost is really high. I should do something more important.”
(A9) “But if I don't vote my acquaintances will see that I have failed to support the candidates that we all think are best, they would think me weird and strange, and disloyal, so then that would maybe diminish my influence, which I could otherwise have used for good ends, so I should vote after all.”
(A10) “But it's important to stand up for one's convictions, to stimulate fruitful discussion. They might think I'm like really sophisticated if I explained all this complicated reasoning for voting, and that might increase my influence, which I can then invest in some good cause”, etc.
There is no reason to think that the ladder would stop there; it's just that we run out of steam at this point. Wherever you end up, you might then wonder whether there are further steps on the ladder, and how much reason you really have for the conclusion you have temporarily arrived at, at that stage.
Should we favor more funding for x-risk tech research? {#should-we-favor-more-funding-for-x-risk-tech-research}
I want to look at one other example of a deliberation ladder more in the context of technology policy and X-risk. This is a kind of argument that can be run with regard to certain types of technologies: whether we should try to promote them or get more funding for them.
The X technology here is nanotechnology —this is in fact the example where this line of reasoning originally came up, some parts of it harking back to Eric Drexler's book _Engines of Creation_, where he actually advocated this line of thinking.
(B1) “So we should fund nanotechnology —this is the ‘level one’ reasoning— because there are many potential future applications: medicine, manufacturing, clean energy, etc. It would be really great if we had all those benefits.”
(B2) “But it also looks like nanotechnology could have important military applications, and it could be used by terrorists etc., to create new weapons of mass destruction that could pose a major existential threat. So if it's so dangerous, _no_, maybe we shouldn't really fund it.”
(B3) “But if this kind of technology is possible, it will almost certainly be developed sooner or later, whether or not _we_ decide to pursue it. (‘We’ being maybe the people in this room or the people in Britain or Western democracies.) If responsible people refrain from developing it, then it will be developed by irresponsible people, which would make the risks even greater, so we should fund it.” (And you can see that with regard to nanotechnology, this seems to work, but the same template could be relevant for evaluating other technologies with upsides and downsides.)
(B4) “But _we_ —obviously not the people in this room, but, say, advanced democracies— are already ahead in its development, so extra funding would only get us there sooner, leaving us less time to prepare for the dangers. So we should not add funding: the responsible people can get there first even without adding funding to this endeavor.”
(B5) But then you look around and see virtually no serious effort to prepare for the dangers of nanotechnology, because —and this is basically Drexler's point back in _Engines_— serious preparation will begin only _after_ a massive project is already underway to develop nanotechnology. Only then will people take the prospect seriously. The earlier a serious Manhattan-like project to develop nanotechnology is initiated, the longer it will take to complete, because the earlier you start, the lower the foundation from which you begin. The project will then run for longer, and that will mean more time for preparation: serious preparation only starts when the project starts, and the sooner the project starts, the longer it will take, so the longer the preparation time will be. And that suggests that we should push as hard as we can to get this project launched immediately, to maximize the time for preparation.
But then there are more considerations that should be taken into account:
(B6) The level of risk will be affected by factors other than the amount of serious preparation that has been made specifically to counter the threat from nanotechnology. For instance, machine intelligence or ubiquitous surveillance might be developed before nanotechnology, eliminating or mitigating the risks of the latter. Although these other technologies may pose great risks of their own, those risks would have to be faced anyway —and there's a lot more that can be said here, against the background of a discourse about these kinds of things that has been going on— and nanotechnology would not really reduce these other risks, like the risks from AI, for example. So the preferred sequence is that we get superintelligence or ubiquitous surveillance before nanotechnology, and so we should oppose extra funding for nanotechnology even though superintelligence and ubiquitous surveillance might be very dangerous on their own, including posing existential risks. Given certain background assumptions about the [technological completion conjecture](https://www.nickbostrom.com/papers/future.pdf) —that in the fullness of time, unless civilization collapses, all possible generally useful technologies will be developed— these dangers will have to be confronted, and all our choice really concerns is the sequence in which we confront them. And it's better to confront superintelligence before nanotechnology, because superintelligence can obviate the nanotechnology risk, but not _vice versa_.
(B7) However, if people oppose extra funding for nanotechnology, then people working in nanotechnology will dislike those people who are opposing it. (This is also a point from Drexler's book.) Other scientists might regard these people who oppose funding for nanotechnology as being anti-science and this will reduce our ability to work with these scientists, hampering our efforts on more specific issues —efforts that stand a better chance of making a material difference than any attempts on our part to influence the level of national funding for nanotechnology. So we should not oppose nanotechnology. That is, rather than opposing nanotechnology in an attempt to slow it down a little bit —and we are a small group, we can't make much difference— we should work with the nanotechnology scientists, be their friend, and then maybe try to influence on the margin, so that they develop nanotechnology in a slightly different way or add some safeguards, and stuff like that.
Again, there is no clear reason to think that we have reached the limit of the level of deliberation that we could apply to this. So it's disconcerting, because it looks like the practical upshot keeps switching back and forth as we look more deeply into the search tree. And we might wonder what that is about. These kinds of deliberation ladders and crucial considerations seem particularly likely to turn up when one is trying to be a thoroughgoing utilitarian, and one really takes these big-picture questions seriously.
Crucial considerations and utilitarianism {#crucial-considerations-and-utilitarianism}
There are some possible reasons why that might be. Compare the domain of application of utilitarianism with another domain of application, say an ordinary human preference function: you want a flourishing life, a healthy family, a successful career and some relaxation, typical human values. If you're trying to satisfy those, it looks less likely that you will encounter a large number of these crucial considerations. Why might that be?
One possible explanation is that we have more knowledge and experience of human life at the personal level. Billions of people have tried to maximize an ordinary human utility function; a lot of things have been tried out and a lot of feedback has been received. So we already know some of the basics, like: if you want to go on for decades, it's a good idea to eat, things like that. They're not suddenly going to be discovered, right? And maybe our preferences have in the first place been shaped by evolution to more or less fit the kind of opportunities we can cognitively exploit in our environment, so we might not have some weird preference that there was no way we could systematically satisfy. With utilitarianism, by contrast, the utilitarian preference, as it were, extends far and wide beyond our familiar environment, including into the cosmic commons, billions of years into the future, and super-advanced civilizations: these do matter from the utilitarian perspective, and matter a lot. Most of what the utilitarian preference cares about is stuff that we have no familiarity with.
Another possible source of crucial considerations with regard to utilitarianism is the difficulty of understanding the goal itself that is postulated by utilitarianism. For example, if one tries to think about how to apply utilitarianism to a world that has a finite probability of being infinite, one runs into difficulties in measuring different infinite magnitudes and in seeing how we could possibly make any difference to them. I have a big paper about that,[^fn:1] and we don't need to go into it here. There are some other issues that consist in trying actually to articulate utilitarianism so as to deal with all these possible cases.
The third possible reason here is that one might think that we are kind of close ---not super close, but close--- to some pivot point of history. That means that we might have special opportunities to influence the long-term future now. And we're still far enough away from this, that it's not obvious what we should do to have the maximally beneficial impact on the future, but still close enough that we can maybe begin to perceive some contours of the apparatus that will shape the future. So, for example, if you think that superintelligence might be this pivot point, or one of them (there may be x-risk pivot points as well that we will confront in this century), then it might just be that we are barely just beginning to get the ability to think about those things, which introduces a whole set of new considerations that might be very important. This could affect the personal domain as well. It's just like with an ordinary person's typical utility function: they probably don't place a million times more value on living for a billion years than living for a hundred years, or a thousand times more value on raising a thousand children than on raising one child. So even though the future still exists, it just doesn't weigh as heavily in a normal human utility function as it does for the utilitarian.
One might also argue that we have recently discovered some key exploration tools that enable us to make these very important discoveries about how to be a good utilitarian, and we haven't yet run the course with these tools, so we keep turning up fundamental new important discoveries using these exploration tools. That's why there seem to be so many crucial considerations being discovered. We might talk a little bit about some of those later in the presentation.
Evaluation functions {#evaluation-functions}
Now let me come at this from a slightly different angle. In chess, the way we would ideally play is to start by thinking through the possible moves that you could make, then the possible responses that your opponent could make, and your responses to those responses. Ideally, you would think that through all the way to the end state, and then select the first move that would be best from the point of view of winning, if you could calculate through the entire game tree. But that's computationally infeasible, because the tree branches too much: you have an exponential number of moves to consider. So what you instead have to do is calculate explicitly some number of plies ahead —maybe a dozen plies or something like that. At that point your analysis has to stop, and what you do is apply some evaluation function, which is relatively simple to compute, and which tries to look at the board state that could result from this sequence of six moves and countermoves and, in some rough-and-ready way, estimate how good that state is. A typical chess evaluation function might look something like this:
Eval_chess = (c1 × material) + (c2 × mobility) + (c3 × king safety) + (c4 × center control) + ...
You have some term that evaluates how much material we have: having your queen and a lot of pieces is beneficial, and the opponent having few of those is also beneficial. We have some metric ---a pawn is worth one, and a queen is worth, I don't know, 11 or something like that--- so you weigh that up: that's one component in the evaluation function. Then maybe consider how mobile your pieces are. If they're all crammed in the corner, that's usually an unpromising situation, so you have some term for that. King safety [is another component in the function]. Center control adds a bit of value: if you control the middle of the board, we know from experience that tends to be a good position. So what you do is calculate explicitly some number of steps ahead and then use this relatively unchanging evaluation function to figure out which of the initial moves you could play would result in the most beneficial situation for you. These evaluation functions are mainly derived from human chess masters who have a lot of experience playing the game. And the parameters ---the weights assigned to these different features--- might also be learned by machine intelligence.
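To make the structure concrete ---this is an illustrative sketch, not part of the talk--- here is depth-limited search with a linear evaluation function in Python. The toy game tree, the feature values, and the weights (the c1...c4 of the formula above) are all invented for the example:

```python
# A minimal sketch of depth-limited game-tree search with a linear
# evaluation function. The game tree, feature values, and weights are
# all invented for illustration; a real engine would generate moves
# from an actual board representation.

WEIGHTS = {"material": 1.0, "mobility": 0.1,
           "king_safety": 0.5, "center_control": 0.3}

def evaluate(features):
    """Eval = c1*material + c2*mobility + c3*king_safety + c4*center_control."""
    return sum(WEIGHTS[k] * features[k] for k in WEIGHTS)

def negamax(state, depth, children, features):
    """Search `depth` plies ahead, then fall back on the evaluation function."""
    moves = children.get(state, [])
    if depth == 0 or not moves:
        return evaluate(features[state])
    # Each ply flips the sign: what is good for me is bad for my opponent.
    return max(-negamax(m, depth - 1, children, features) for m in moves)

# A toy two-ply "game": positions are labels, leaves carry feature vectors.
children = {"root": ["a", "b"], "a": ["a1"], "b": ["b1"]}
features = {
    "a1": {"material": 2, "mobility": 5, "king_safety": 1, "center_control": 0},
    "b1": {"material": 1, "mobility": 2, "king_safety": 3, "center_control": 2},
}

# Pick the root move leading to the position best for us after the reply.
best = max(children["root"], key=lambda m: -negamax(m, 1, children, features))
```

Real chess engines follow this outline: explicit search for a bounded number of plies, then a cheap static evaluation at the frontier, with the weights tuned by hand or by machine learning.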
We do something analogous to that in other domains. In traditional public policy, social welfare economists might think that you need to maximize some social welfare function, which might take a form somewhat like this:
Eval_public_policy = (c1 × GDP) + (c2 × employment) + (c3 × equality) + (c4 × environment) + ...
GDP? Yes, we want more GDP, but we also have to take into account the level of employment, maybe the amount of equality or inequality, some factor for the health of the environment. It might not be that whatever we write there is exactly the thing that is equivalent to moral goodness fundamentally considered. But we know that these things tend to be good, or we think so. This is a useful approximation of true value that might be more tractable in a practical decision-making context. One thing we can ask, then, is whether there is something similar to that for moral goodness.
Eval_moral_goodness = ?
You want to do the morally best thing you can do, but calculating all of that out from scratch looks difficult or impossible in any one situation. You need some more stable principles that you can use to evaluate the different things you could do. So here we might look at the more restricted version of utilitarianism and wonder what we might put in there.
Eval_utilitarian = ?
So here we can hark back to some of the things Beckstead talked about. If we plot capacity, which could be sort of level of economic development and technological sophistication —stuff like that— on one axis and time on the other, my view is that the human condition is a kind of metastable region on this capability axis:
{{< figure src="/ox-hugo/future-technological-capacity-graph.png" alt="A graph of capacity over time depicting various existential outcomes. Branches lead to extinction, short-term viability, singleton sustainability, or an upper limit labeled cosmic endowment." >}}
You might fluctuate inside it for a while, but the longer the time scale you're considering, the greater the chance that you will exit that region, either in the downwards direction ---we fall below the minimum viable population size, or are left with too few resources, and go extinct (and that's one attractor state: once you're extinct, you tend to stay extinct)--- or in the upwards direction: we get through to technological maturity, start a colonization process, and the future of earth-originating intelligent life might then just be this bubble that expands at some significant fraction of the speed of light and eventually accesses all the cosmological resources that are in principle accessible from our starting point. That's a finite quantity: because of the positive cosmological constant, it looks like we can only access a finite amount of stuff. But once you've started that ---once you've set off an intergalactic empire--- it looks like it could just keep going, with high probability, to this natural conclusion.
From that perspective we can define the concept of an existential risk as a risk of failing to realize the potential for value that you could gain by accessing the cosmological commons, either by going extinct or by accessing all the cosmological commons but then failing to use them for beneficial purposes, because your values have been corrupted or something like that.
That suggests this MAXIPOK principle that Beckstead also mentioned: “Maximize the probability of an OK outcome”:
arg max [- P(existential catastrophe | action)]
It's clearly, at best, a rule of thumb: it's not meant to be a valid moral principle that's true in all possible situations. In fact, if you want to go from the original principle you started with to something practically tractable, I think you have to make it contingent on various empirical assumptions. That's the trade-off there: you want to make assumptions that are as weak as possible while still moving as far as possible towards tractability, and I think this one strikes a reasonable compromise. In other words, take the action that minimizes the integral of existential risk that humanity will confront. It will not always give you the right answer, but it's a starting point. Beyond the cases Beckstead mentioned, there could be other scenarios where this would give the wrong answer. If you thought that there was a big risk of some sort of hyper-existential catastrophe ---some sort of hell scenario--- then you might want to accept a slightly higher level of existential risk in order to decrease the risk that there would be not just an existential catastrophe but a hyper-existential catastrophe. Other things that could come into it are trajectory changes that are less than drastic and just shift things slightly.
For present purposes, we could consider the suggestion of using the Maxipok rule as our attempt to define the value function for utilitarian agents.
Eval_utilitarian ≈ MAXIPOK
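As a toy rendering of the rule ---my own sketch, with made-up numbers, not anything from the talk--- if we could estimate P(existential catastrophe | action) for each candidate action, Maxipok would simply select the action with the highest probability of an OK outcome:

```python
# A toy rendering of Maxipok: among candidate actions, choose the one
# that maximizes the probability of an OK outcome, i.e. minimizes
# P(existential catastrophe | action). The action names and
# probabilities below are purely hypothetical.

p_catastrophe = {                 # hypothetical estimates only
    "accelerate_tech": 0.30,
    "do_nothing": 0.20,
    "fund_safety_research": 0.12,
}

# argmax over actions of [1 - P(existential catastrophe | action)]
maxipok_choice = max(p_catastrophe, key=lambda a: 1 - p_catastrophe[a])
```

The hard part, of course, is estimating those probabilities; the rule only tells you what to do with them once you have them.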
Then the question becomes, “If you want to minimize existential risk, what should you do?”
Eval_MAXIPOK = ?
That is still a very high-level objective. We still need to do more work to break that down into more tangible components.
_[Slide: A conceptual graphic entitled 'Dynamic Sustainability' with axes labelled 'Technology', 'Coordination', and 'Insight', depicting humanity's current position with a rocket, dangerous regions with lightning, and a safe region indicated by a sun.]_
So another little jump cut. I'm not exactly sure how well this fits in with the rest of the presentation ---I have this nice slide from another presentation. Maybe it's a different way of saying some of what I just said: instead of thinking about sustainability as it is commonly conceived ---a static concept, a stable state that we should try to approximate, in which we use up no more resources than are regenerated by the natural environment--- we need, I think, to think about sustainability in dynamical terms. Instead of reaching a state, we try to enter and stay on a trajectory that is indefinitely sustainable, in the sense that we can continue to travel on it indefinitely and it leads in a good direction.
An analogy here would be a rocket. One stable state for a rocket is on the launch pad: it can stand there for a long time. Another stable state is up in space, where it can continue to travel for an even longer time, perhaps, since it doesn't rust and so on. But in midair you have an unstable system. I think that's where humanity is now: we're in midair. The static sustainability concept suggests that we should reduce our fuel consumption to the minimum that just enables us to hover there, and thus perhaps prolong the duration for which we can stay in our current situation. What we perhaps instead should do is maximize the fuel consumption, so that we have enough thrust to reach escape velocity. (And that's not a literal argument for burning as much fossil fuel as possible: it's just a metaphor.)
But the point here is that there are several different axes. To have a utopia ---the best possible condition--- we need super-advanced technology: to be able to access the cosmic commons, to be able to cure all the diseases that plague us, and so on. I think that to have the best possible world you will also need a huge amount of insight and wisdom, and a large amount of coordination, so as to avoid using high technology to wage war against one another, and so forth.
Eval_MAXIPOK = f (wisdom, coordination, differential tech development, ...)
Ultimately, we would want a state in which we have huge quantities of each of these three variables, but that leaves open the question of what we want more of given our current situation. It might be, for example, that we would want more coordination and insight before we have more technology of a certain type. Before we have various powerful technologies, we would first want to make sure that we have enough peace and understanding not to use them for warfare, and enough insight and wisdom not to accidentally blow ourselves up with them. A superintelligence, clearly, seems to be something you want in utopia ---it's a very high level of technology--- but we might want a certain amount of insight before we develop superintelligence, so that we can develop it in the correct way. Anyway, one can begin to think about whether, by analogy with the computer chess situation, there are different features that one could regard as components of this evaluation function for the utilitarian (the MAXIPOK). The principle of differential technological development says: retard the development of dangerous and harmful technologies ---the ones that raise existential risk, that is--- and accelerate technologies that reduce existential risk. This is a first sketch, not a final answer, but one might think we want a lot of wisdom, we want a lot of international peace and cooperation, and, with regard to technologies, it gets a little bit more complicated: we want faster progress in some technology areas, perhaps, and slower in others. I think those are three broad kinds of things one might want to put into one's evaluation function.
Cause selection vs. signature determination {#cause-selection-vs.-signature-determination}
This suggests that one thing to be thinking about, in addition to interventions or causes, is the sign of different kinds of things. An intervention should be high leverage, and a cause area should promise high-leverage interventions. It's not enough that something you could do would do good; you also want to think hard about how much good it could do relative to other things you could do. There is no point in thinking about causes without thinking about the low-hanging fruit you could access, so a lot of the thinking is about that. But when we're moving on this more elevated plane ---this high altitude where the crucial considerations are--- it also seems to become valuable to think about determining the sign of different basic parameters, maybe even when we are not sure how we could affect them ---the sign being, basically, "Do we want more or less of it?". We might initially bracket questions of leverage here, because to first orient ourselves in the landscape we might want to postpone that question a little bit in this context. But a good signpost ---that is, a good parameter whose sign we would like to determine--- has to be visible from afar. That is, if we define some quantity in terms that still make it very difficult to say, of any particular intervention, whether it contributes positively or negatively to that quantity, then it's not so useful as a signpost. Take "maximize expected value": that is a quantity we could define, but it just doesn't help us very much, because whenever you try to do something specific you're still virtually as far away as you were.
But on the other hand, if you set some more concrete objective, like "maximize the number of people in this room", we can now easily tell how many people there are, and we have ideas about how we could maximize it, so for any particular action we think of we can easily see how it bears on the objective of maximizing the number of people in this room. However, we might feel it's very difficult to get strong reasons for thinking that more people in this room is better; presumably there is some inverse-U curve there. A good signpost should strike a reasonable compromise between being visible from afar and being such that we can have strong reason to be sure of its sign.
Some tentative signposts {#some-tentative-signposts}
Here are some very tentative signposts, and they're tentative in my own view, and I guess there might also be a lot of disagreement among different people, so these are more areas for investigation. But it might be useful just to show how one might begin to think about it.
_Do we want faster progress in computer hardware or slower progress?_ My best guess there is that we want slower progress. And that has to do with the risks from the machine intelligence transition. Faster computers would make it easier to create AI, which (a) would probably make it happen sooner, which seems bad in itself because it leaves less time for the relevant kind of preparation, of which there is a great need; and (b) might reduce the skill level required to produce AI. With a ridiculously large amount of computing power you might be able to produce AI without really knowing much about what you're doing. When you are hardware-constrained you might need more insight and understanding, and it's better that AI be created by people who have more insight and understanding.
This is not by any means a knockdown argument, because there are other existential risks. If you thought that we were about to go extinct soon, because somebody will develop nanotechnology, then you might want to try the AI wildcard as soon as possible. All things considered, this is my current best guess, but this is the kind of reasoning one can engage in.
_Whole brain emulation?_ We did a long, detailed analysis of that.[^fn:2] More specifically, not whether we want to have whole brain emulation, but whether we want more or less funding for whole brain emulation, more or fewer resources for developing it. This is one possible path towards machine superintelligence, and for complicated reasons my guess is "No", but that's even more uncertain, and we have a lot of different views in our research group on that. (In the discussion, if anybody is interested, we can zoom in on that.)
_Biological cognitive enhancement of humans?_ My best guess there is that we want faster progress in that area.
I talk more about these three in the book and also about artificial intelligence. _Artificial intelligence?_ I think we want AI probably to happen a little bit slower than it's likely to do by default.
Another question is: _If there is one company or project or team that will develop the first successful AI, how far ahead does one want that team to be of the second team trying to do it?_ My best guess is that we want it to have a big lead ---many years, ideally--- to enable it to slow down at the end and implement more safety measures, rather than being in a tight tech race.
_Solutions to the control problem for AI?_ I think we want faster progress in that, and that's one of our focus areas, and some of our friends from the [Machine Intelligence Research Institute](https://intelligence.org/) are here, also working hard on that. Let's move away from the AI domain.
_The effective altruism movement?_ I think that looks very good in many ways, robustly good, to have faster, better growth in that.
_International peace and cooperation?_ Looks good.
_Synthetic biology?_ I think it looks bad. We haven't thought as carefully about that, so that could change, but it looks like there could be x-risks from that, although [it looks] also beneficial. Insofar as it might enable improvements in cognitive enhancement, there'll be a kind of difficult trade-off.
_Nanotechnology?_ I think it looks bad: we want slower progress towards that.
_Economic growth?_ Very difficult to tell the sign of that, in my view. And within a community of people that have thought hard about that there are, again, different guesses as to the sign of that.
_Small and medium-scale catastrophe prevention?_ For global catastrophic risks falling short of existential risk, it's very difficult to know the sign. Here we are bracketing leverage altogether: even just knowing whether we would want more or less of it, if we could get it for free, is non-obvious. On the one hand, small-scale catastrophes might create an immune response that makes us better ---puts in place better safeguards and things like that--- which could protect against the really big stuff. If we're thinking about medium-scale catastrophes that could cause civilizational collapse ---large by ordinary standards, but only medium-scale in comparison to existential catastrophes, which are large in this context--- it is, again, not totally obvious what the sign is: there's a lot more work to be done to try to figure that out. If recovery looks very likely, you might then have guesses as to whether the recovered civilization would be more likely to avoid existential catastrophe having gone through this experience or not.
A lot more work is needed, but these are the parameters one can begin to think about. One doesn't realize just how difficult this is: even some parameters that from an ordinary common-sense point of view seem kind of obvious actually turn out to be quite non-obvious once you start to think through the way they're all supposed to fit together. Suppose you're an administrator here in Oxford, working as the secretary in the Computer Science department. Suppose you find some way to make the department run slightly more efficiently: you create a mailing list so that everybody who has an announcement to make can just email it to the list, rather than having to put each person individually in the address field. And that's a useful thing ---a great thing: it didn't cost anything, other than a one-off cost, and now everybody can go about their business more easily. Yet from this perspective, it's very non-obvious whether that is, in fact, a good thing. It might be contributing to AI: that might be its main effect, other than a very small general effect on economic growth, which is questionable. And it may well be that you have made the world worse in expectation by making this little efficiency improvement. So this project of trying to think all of this through is, in a sense, a little bit like the Nietzschean _[Umwertung aller Werte](https://en.wikipedia.org/wiki/Transvaluation_of_values)_ (the revaluation of all values), a project he never had a chance to complete, because he went mad before he could start.
Possible areas with additional crucial considerations {#possible-areas-with-additional-crucial-considerations}
- Counterfactual trade
- Simulation stuff
- Infinite paralysis
- Pascalian muggings
- Different kinds of aggregative ethics (total, average, negative)
- Information hazards
- Aliens
- Baby universes
- Other kinds of moral uncertainty
- Other game theory stuff
- Pessimistic metainduction; epistemic humility; anthropics
- Insects, subroutines
So, these are some kinds of areas —I'm not going to go into all of these, I'm just giving examples of the kinds of areas where today it looks like there might still be crucial considerations. This is not an exhaustive list by any means, and we can talk more about some of those. They kind of go from more general and abstract and powerful, to more specific and understandable by ordinary reasoning.
To pick an example: _insects_. If you are a classical utilitarian, this consideration arises within the more mundane sphere ---setting aside the cosmological commons and just thinking about what happens here on Earth. If insects are sentient, then maybe the amount of sentience in insects is very large, because there are so very, very many of them. So maybe the effect of our policies on insect well-being trumps the effect of our policies on human well-being or on animals in factory farms and so forth. I'm not saying it does, but it's a question that is non-obvious and that could have a big impact.
_Subroutines_. With certain kinds of machine intelligence there are processes, like reinforcement learning algorithms and other subprocesses within the AI. Maybe some of those could turn out to have moral status in some way. Maybe there will be hugely large numbers of runs of these subprocesses, so that if it turns out that some of these kinds of things count for something, then maybe the numbers again would come to dominate. Each of these is a whole workshop on its own, so it's not something we can go into.
Some partial remedies {#some-partial-remedies}
So what can one do if one suspects that there might be these crucial considerations, some of them not yet discovered? I don't have a crisp answer to that. Here are some _prima facie_ plausible things one might try to do a little bit of:
- _Don't act precipitously, particularly in ways that are irrevocable._
- _Invest in more analysis to find and assemble missing crucial considerations._ That's why I'm doing the kind of work that I'm doing, and the rest of us are also involved in that enterprise.
- _Take into account that expected value changes are probably smaller than they appear when thinking at this scale._ If you are a utilitarian and you think of some new argument that has a radical implication for what you should be doing, the first instinct might be to radically change your expected utility of different practical policies in light of this new insight. But when you reflect on the fact that new crucial considerations are discovered every once in a while, maybe you should still change your expected value, but not as much as it seems you should at first sight. You should reflect on this at the meta level.
- _Use parliamentary/mixed models._ If we widen our purview beyond utilitarianism, as we should, and consider things from a more general, unrestricted normative perspective, then something like the Parliamentary Model[^fn:3] for taking normative uncertainty into account looks fairly robust. This is the idea that if you are unsure which moral theory is true, you should assign probabilities to different moral theories and imagine a kind of parliament, where each moral theory gets to send delegates in proportion to its probability. In this imaginary parliament, the delegates from the different moral theories discuss and compromise and work out what to do. And then you should do what that moral parliament of yours would have decided ---as a sort of metaphor. The idea is that, other things equal, the more probability a moral theory has, the greater its say in determining your actions, but there might also be trades between different moral theories, which I think Toby talked about in [his presentation](https://soundcloud.com/gooddoneright/toby-ord-moral-trade). This is one metaphor for how to conceive of those trades. It might not be exactly the right way to think about fundamental normative uncertainty, but it seems to come close in many situations, and it seems relatively robust, in the sense of being unlikely to have a totally crazy implication.
- _Focus more on near-term and convenient objectives._ The more one despairs of having any coherent view about how to maximize aggregative welfare in this cosmological context, the greater, it seems, becomes the effective voice of the other things one might place weight on. So if you're partly an egoist and partly an altruist, and the altruistic component is stuck on this kind of deliberation ladder, then maybe you should go more with the egoistic part, until and unless you can find stability in your altruistic deliberations.
- And then this general idea of maybe _focus on developing our capacity as a civilization to wisely deliberate on these types of things_: to build up our capacity, rather than pursuing very specific goals —and by capacity in this context it looks like perhaps we should focus less on powers and more on the propensity to use powers well. This is still quite vague, but something in that general direction seems to be robustly desirable. Certainly, you could have a crucial consideration that's turned up to show that that was the wrong thing to do, but it still looks like a reasonable guess.
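The Parliamentary Model mentioned above can be given a toy formalization ---again my own sketch: the theories, credences, and scores are invented, and a simple weighted vote leaves out the bargaining between delegates that the model emphasizes:

```python
# A toy formalization of the Parliamentary Model: each moral theory
# sends delegates in proportion to its probability. Absent bargaining,
# this reduces to a probability-weighted vote over options. All the
# theories, credences, and scores here are hypothetical.

credences = {"total_utilitarianism": 0.5,   # P(theory is true)
             "deontology": 0.3,
             "virtue_ethics": 0.2}

# Each theory's evaluation of two hypothetical options, on a 0-1 scale.
scores = {
    "total_utilitarianism": {"option_A": 0.9, "option_B": 0.4},
    "deontology":           {"option_A": 0.2, "option_B": 0.8},
    "virtue_ethics":        {"option_A": 0.5, "option_B": 0.6},
}

def parliament_vote(option):
    """Probability-weighted approval of an option across the theories."""
    return sum(credences[t] * scores[t][option] for t in credences)

chosen = max(["option_A", "option_B"], key=parliament_vote)
```

Note that a pure weighted vote lets a high-probability theory dominate; the delegates' trading and compromising, which the model treats as essential, is exactly what this sketch omits.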
That's it. Thanks.
[^fn:1]: {{< cite "Bostrom2016InfiniteEthics" >}}
[^fn:2]: {{< cite "Sandberg2008WholeBrainEmulation" >}}
[^fn:3]: {{< cite "Bostrom2009MoralUncertaintySolution" >}}]]></description></item><item><title>Ben Kuhn on the effective altruist movement</title><link>https://stafforini.com/notes/ben-kuhn-on-the-effective-altruist-movement/</link><pubDate>Wed, 23 Jul 2014 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/ben-kuhn-on-the-effective-altruist-movement/</guid><description>&lt;![CDATA[Ben Kuhn is a data scientist and engineer at a small financial technology firm. He previously studied mathematics and computer science at Harvard, where he was also co-president of [Harvard College Effective Altruism](http://harvardea.org/). He writes on effective altruism and other topics at [his website](http://www.benkuhn.net/).
---
**Pablo**: How did you become involved in the EA movement?
**Ben**: When I was a sophomore in high school (that's age 15 for non-Americans), Peter Singer gave his [_The Life You Can Save_](http://www.thelifeyoucansave.org/) talk at [my high school](http://commschool.org/). He went through his whole "child drowning in the pond" spiel and explained that we were morally obligated to give money to charities that helped those who were worse off than us. In particular, I think at that point he was recommending donating to Oxfam in a sort of Kantian way where you gave an amount of money such that if everyone gave the same percentage it would eliminate world poverty. My friends and I realized that there was no utilitarian reason to stop at that amount of money--you should just donate everything that you didn't need to survive.
So, being not only sophomores but also sophomoric, we decided that since Prof. Singer didn't live in a cardboard box and wear only burlap sacks, he must be a hypocrite and therefore not worth paying attention to.
Sometime in the intervening two years I ran across Yvain's essay [_Efficient Charity: Do Unto Others_](http://lesswrong.com/lw/3gj/efficient_charity_do_unto_others/) and through it [GiveWell](http://www.givewell.org/). I think that was the point where I started to realize Singer might have been onto something. By my senior year (ages 17-18) I at least professed to believe pretty strongly in some version of effective altruism, although I think I hadn't heard of the term yet. I wrote [an essay](https://s3.amazonaws.com/bknet/on_charity.pdf) on the subject in a publication that my writing class put together. It was anonymous (under the brilliant _nom de plume_ of "Jenny Ross") but somehow my classmates all figured out it was me.
The next big update happened during the spring of my first year of Harvard, when I started going to the Cambridge Less Wrong meetups and met [Jeff](http://www.jefftk.com/index) and [Julia](http://www.givinggladly.com/). Through some chain of events they set me up with the folks who were then running Harvard High-Impact Philanthropy (which later became [Harvard Effective Altruism](http://harvardea.org/)). After that spring, almost everyone else involved in HHIP left and I ended up becoming president. At that point I guess I counted as "involved in the EA movement", although things were still touch-and-go for a while until John Sturm came onto the scene and made HHIP get its act together and actually do things.
**Pablo**: In spite of being generally sympathetic to EA ideas, you have recently written a thorough [critique of effective altruism](http://www.benkuhn.net/ea-critique).  I'd like to ask you a few questions about some of the objections you raise in that critical essay.  First, you have drawn a distinction between pretending to try and actually trying.  Can you tell us what you mean by this, and why do you claim that a lot of effective altruism can be summarized as “pretending to actually try”?
**Ben**: I'm not sure I can explain better than what I wrote in that post, but I'll try to expand on it. For reference, here's the excerpt that you referred to:
By way of clarification, consider a distinction between two senses of the word “trying”.... Let's call them “actually trying” and “pretending to try”. Pretending to try to improve the world is something like responding to social pressure to improve the world by querying your brain for a thing which improves the world, taking the first search result and rolling with it. For example, for a while I thought that I would try to improve the world by developing computerized methods of checking informally-written proofs, thus allowing more scalable teaching of higher math, democratizing education, etc. Coincidentally, computer programming and higher math happened to be the two things that I was best at. This is pretending to try. Actually trying is looking at the things that improve the world, figuring out which one maximizes utility, and then doing that thing. For instance, I now run an effective altruist student organization at Harvard because I realized that even though I'm a comparatively bad leader and don't enjoy it very much, it's still very high-impact if I work hard enough at it. This isn't to say that I'm actually trying yet, but I've gotten closer.
Most people say they want to improve the world. Some of them say this because they actually want to improve the world, and some of them say this because they want to be perceived as the kind of person who wants to improve the world. Of course, in reality, everyone is motivated by other people's perceptions to some extent--the only question is by how much, and how closely other people are watching. But to simplify things let's divide the world up into those two categories, "altruists" and "signalers."
If you're a signaler, what are you going to do? If you don't try to improve the world at all, people will notice that you're a hypocrite. On the other hand, improving the world takes lots of resources that you'd prefer to spend on other goals if possible. But fortunately, looking like you're improving the world is easier than actually improving the world. Since people usually don't do a lot of due diligence, the kind of improvements that signallers make tend to be ones with very good appearances and surface characteristics--like [PlayPumps](http://www.pbs.org/frontlineworld/stories/southernafrica904/video_index.html), water-pumping merry-go-rounds which initially appeared to be a clever and elegant way to solve the problem of water shortage in developing countries. PlayPumps got tons of money and celebrity endorsements, and their creators got lots of social rewards, even though the pumps turned out to be hideously expensive, massively inefficient, prone to breaking down, and basically a disaster in every way.
So in this oversimplified world, the EA observation that "charities vary in effectiveness by orders of magnitude" is explained by "charities" actually being two different things: one group optimizing for looking cool, and one group optimizing for actually doing good. A large part of effective altruism is realizing that signaling-charities ("pretending to try") often don't do very much good compared to altruist-charities.
(In reality, of course, everyone is driven by some amount of signaling and some amount of altruism, so these groups overlap substantially. And there are other motivations for running a charity, like being able to convince _yourself_ that you're doing good. So it gets messier, but I think the vastly oversimplified model above is a good illustration of where my point is coming from.)
Okay, so let's move to the second paragraph of the post you referenced:
Using this distinction between pretending and actually trying, I would summarize a lot of effective altruism as “pretending to actually try”. As a social group, effective altruists have successfully noticed the pretending/actually-trying distinction. But they seem to have stopped there, assuming that knowing the difference between fake trying and actually trying translates into ability to actually try. Empirically, it most certainly doesn't. A lot of effective altruists still end up satisficing---finding actions that are on their face acceptable under core EA standards and then picking those which seem appealing because of other essentially random factors. This is more likely to converge on good actions than what society does by default, because the principles are better than society's default principles. Nevertheless, it fails to make much progress over what is directly obvious from the core EA principles. As a result, although “doing effective altruism” feels like truth-seeking, it often ends up being just a more credible way to pretend to try.
The observation I'm making here is roughly that EA seems not to have switched entirely to doing good for altruistic rather than signaling reasons. It's more like we've switched to _signaling_ that we're doing good for altruistic rather than signaling reasons. In other words, the motivation didn't switch from "looking good to outsiders" to "actually being good"--it switched from "looking good to outsiders" to "looking good to the EA movement."
Now, the EA movement is way better than random outsiders at distinguishing between things with good surface characteristics and things that are actually helpful, so the latter criterion is much stricter than the former, and probably leads to much more good being done per dollar. (For instance, I doubt the EA community would ever endorse something like PlayPumps.) But, at least at the time of writing that post, I saw a lot of behavior that seemed to be based on finding something pleasant and with good surface appearances rather than finding the thing that optimized utility--for instance, donating to causes without a particularly good case that they were better than saving or picking career options that seemed decent-but-not-great from an EA perspective. That's the source of the phrase "pretending to actually try"--the signaling isn't going away, it's just moving up a level in the hierarchy, to signaling that you don't care about signaling.
Looking back on that piece, I think “pretending to actually try” is still a problem, but my intuition is now that it's probably not huge in the scheme of things. I'm not quite sure why that is, but here are some arguments against it being very bad that have occurred to me:
- It's probably somewhat less prevalent than I initially thought, because the EAs making weird-seeming decisions may be making them for reasons that aren't transparent to me and that get left out by the typical EA analysis. The typical EA analysis tends to be a 50000-foot average-case argument that can easily be invalidated by particular personal factors.
- As Katja Grace [points out](http://meteuphoric.wordpress.com/2013/12/22/pretend-to-really-really-try/), encouraging pretending to really try might be optimal from a movement-building perspective, inasmuch as it's somewhat inescapable and still leads to pretty good results.
- I probably overestimated the extent to which motivated/socially-pressured life choices are bad, for a couple reasons. I discounted the benefit of having people do a diversity of things, even if the way they came to be doing those things wasn't purely rational. I also discounted the cost of doing something EA tells you to do instead of something you also want to do.
- For instance, suppose for the sake of argument that there's a pretty strong EA case that politics isn't very good (I know this isn't actually true). It's probably good for marginal EAs to be dissuaded from going into politics by this, but I think it would still be bad for every single EA to be dissuaded from going into politics, for two reasons. First, the arguments against politics might turn out to be wrong, and having a few people in politics hedges against that case. Second, it's much easier to excel at something you're motivated at, and the category of "people who are excellent at what they do" is probably as important to the EA movement as "people doing job X" for most X.
I also just haven't noticed as much pretending-stuff going on in the last few months, so maybe we're just getting better at avoiding it (or maybe I'm getting worse at noticing it). Anyway, I still definitely think there's pretending-to-actually-try going on, but I don't think it's a huge problem.
**Pablo**: In another section of that critique, you express surprise at the fact that so many effective altruists donate to global health causes now. Why would you expect EAs to use their money in other ways--whether it's donating now to other causes, or donating later--and what explains, in your opinion, this focus on causes for which we have relatively good data?
**Ben**: I'm no longer sure enough of where people's donations are going to say with certainty that too much is going to global health. My update here comes from a combination of being overconfident when I wrote the piece, and what looks like an increase in waiting to donate shortly after I wrote it. The latter was probably due in large part to [AMF's delisting](http://blog.givewell.org/2013/11/26/change-in-against-malaria-foundation-recommendation-status-room-for-more-funding-related/) and perhaps the precedent set by [GiveWell employees](http://blog.givewell.org/2013/12/12/staff-members-personal-donations/), many of whom waited last year (though [others argued against it](http://blog.givewell.org/2013/12/31/some-considerations-against-saving-for-next-year/)). (Incidentally, I'm excited about the projects going on to make this more transparent, e.g. the questions on the survey about giving!)
The giving now vs. later debate has been [ably summarized by Julia Wise](http://www.effective-altruism.com/giving-now-vs-later-summary/) on the EA blog. My sense from reading various arguments for both sides is that I more often see bad arguments for giving now. There are definitely good arguments for giving at least some money now, but on balance I suspect I'd like to see more saving. Again, though, I don't have a great idea of what people's donation behavior actually is; my samples could easily be biased.
I think my strongest impression right now is that I suspect we should be exploring more different ways to use our donations. For instance, some people who are earning to give have experimented with funding people to do independent research, which was a pretty cool idea. Off the top of my head, some other things we could try include scholarships, essay contest prizes, career assistance for other EAs, etc. In general it seems like there are tons of ways to use money to improve the world, many of which haven't been explored by GiveWell or other evaluators and many of which don't even fall in the category of things they care about (because they're too small or too early-stage or something), but we should still be able to do something about them.
**Pablo**: In the concluding section of your essay, you propose that _self-awareness_ be added to the list of principles that define effective altruism. Any thoughts on how to make the EA movement more self-aware?
**Ben**: One thing that I like to do is think about what our blind spots are. I think it's pretty easy to look at all the stuff that is obviously a bad idea from an EA point of view, and think that our main problem is getting people "on board" (or even "getting people to admit they're wrong") so that they stop pursuing obviously bad ideas. And that's certainly helpful, but we also have a ways to go just in terms of figuring things out.
For instance, here's my current list of blind spots--areas where I wish there were a lot more thinking and idea-spreading going on than there currently is:
- **Being a good community.** The EA community is already having occasional growing pains, and this is only going to get worse as we gain steam, e.g. with Will MacAskill's upcoming book. And beyond that, I think that ways of making groups more effective (as opposed to individuals) have a lot of promise for making the movement better at what we do. Many, many intellectual groups fail to accomplish their goals for basically silly reasons, while seemingly much worse groups do much better on this dimension. It seems like there's no intrinsic reason we should be worse than, say, Mormons at building an effective community, but we're clearly not there yet. I think there's absolutely huge value in getting better at this, yet almost no one is putting in a serious concerted effort.
- **Knowing history.** Probably as a result of EA's roots in math/philosophy, my impression is that our average level of historical informedness is pretty low, and that this makes us miss some important pattern-matches and cues. For instance, I think a better knowledge of history could help us think about capacity-building interventions, policy advocacy, and community building.
- **Fostering more intellectual diversity.** Again because of the math/philosophy/utilitarianism thing, we have a massive problem with intellectual monoculture. Of my friends, the ones I enjoy talking about altruism the most with now are largely actually the ones who associate least with the broader EA community, because they have more interesting and novel perspectives.\*
- **Finding individual effective opportunities**. I suspect that there's a lot of room for good EA opportunities that GiveWell hasn't picked up on because they're specific to a few people at a particular time. Some interesting stuff has been done in this vein in the past, like funding small EA-related experiments, funding people to do independent secondary research, or giving loans to other EAs investing in themselves (at least I believe this has been done). But I'm not sure if most people are adequately on the lookout for this kind of opportunity.
(Since it's not fair to say "we need more X" without specifying how we get it, I should probably also include at least one anti-blind spot that I think we should be spending fewer resources on, on the margin: object-level donations to e.g. global health causes. I feel like we may be hitting diminishing returns here. Probably donating some is important for signaling reasons, but I think it doesn't have a very high naive expected value right now.)
**Pablo**: Finally, what are your plans for the mid-term future?  What EA-relevant activities will you engage in over the next few years, and what sort of impact do you expect to have?
**Ben**: A while ago I did some reflecting and realized that most of the things I did that I was most happy about were pretty much unplanned--they happened not because I carefully thought things through and decided that they were the best way to achieve some goal, but because they intuitively seemed like a cool thing to do. (Things in this category include starting a blog, getting involved in the EA/rationality communities, running Harvard Effective Altruism, getting my current job, etc.) As a result, I don't really have "plans for the mid-term future" per se. Instead, I typically make decisions based on intuitions/heuristics about what will lead to the best opportunities later on, without precisely knowing (or even knowing at all, often) what form those opportunities will take.
So I can't tell you what I'll be doing for the next few years--only that it will probably follow some of my general intuitions and heuristics:
- **Do lots of things**. The more things I do, the more I increase my "luck surface area" to find awesome opportunities.
- **Do a few things really well**. The point of this heuristic is hopefully obvious.
- **Do things that other people aren't doing**--or more accurately, things that not enough people are doing relative to how useful or important they are. My effort is most likely to make a difference in an area that is relatively under-resourced.
I'd like to take a moment here to plug the [conference call on altruistic career choice](http://www.givewell.org/altruistic-career-choice) that Holden Karnofsky of GiveWell had, which makes some great specific points along these lines.
Anyway, that's my long-winded answer to the first part of this question. As far as EA-relevant activities and impacts, all the same caveats apply as above, but I can at least go over some things I'm currently interested in:
- Now that I'm employed full-time, I need to start thinking much harder about where exactly I want to give: both what causes seem best, and which interventions within those causes. I actually currently don't have much of a view on what I would do with more unrestricted funds.
- Related to the point above about self-awareness, I'm interested in learning some more EA-relevant history--how previous social movements have worked out, how well various capacity-building interventions have worked, more about policy and the various systems that philanthropy comes into contact with, etc.
- I'm interested to see to what extent the success of Harvard Effective Altruism can be sustained at Harvard and replicated at other universities.
I also have some more speculative/gestational interests--I'm keeping my eye on these, but don't even have concrete next steps in mind:
- I think there may be under-investment in healthy EA community dynamics, preventing common failure modes like unfriendliness, ossification to new ideas, groupthink etc.--though I can't say for sure because I don't have a great big-picture perspective of the EA community.
- I'm also interested in generally adding more intellectual/epistemic diversity to EA--we have something of a monoculture problem right now. Anecdotally, there are a number of people who I think would have a really awesome perspective on many problems that we face, but who get turned off of the community for one reason or another.]]></description></item><item><title>The Gift</title><link>https://stafforini.com/notes/the-gift/</link><pubDate>Fri, 24 May 2013 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/the-gift/</guid><description>&lt;![CDATA[by Ian Parker
_The New Yorker_, vol. 80, no. 21 (August 2, 2004), pp. 54-63
Last summer, not long after Zell Kravinsky had given almost his entire forty-five-million-dollar real-estate fortune to charity, he called Barry Katz, an old friend in Connecticut, and asked for help with an alibi. Would Katz call Kravinsky's wife, Emily, in Philadelphia, and say that the two men were about to take a weeklong trip to Katz's ski condominium in Vermont? This untruth would help Kravinsky do something that did not have his wife's approval: he would be able to leave home, check into the Albert Einstein Medical Center, in Philadelphia, for a few days, and donate a kidney to a woman whose name he had only just learned.
Katz refused, and Kravinsky became agitated. He said that the intended recipient of his gift would die without the kidney, and that his wife's reluctance to support this "nondirected" donation--it would be only the hundred and thirty-fourth of its kind in the United States--would make her culpable in that death. "I can't allow her to take this person's life!" Kravinsky said. He was, at forty-eight, a former owner of shopping malls and distribution centers, and a man with a single thrift-store suit that had cost him twenty dollars.
"You think she'd be taking a life?" Katz asked.
"Absolutely," Kravinsky replied.
Katz then asked, warily, "Do you mean that anybody who is not donating a kidney is taking someone's life?"
"Yes," Kravinsky said.
"So, by your terms, I'm a murderer?"
"Yes," Kravinsky said, in as friendly a way as possible.
After a pause, Katz said, "I have to get off the phone--I can't talk about this anymore," and he hung up. A few weeks later, Kravinsky crept out of his house at six o'clock in the morning while his wife and children were still asleep. Emily Kravinsky learned that her husband had donated a kidney when she read about it in a local newspaper.
Kravinsky, whose unrestrained disbursement of his assets--first financial, then corporeal--has sometimes been unsettling for the people close to him, grew up in a row house in the working-class Philadelphia neighborhood of Oxford Circle, amid revolutionary rhetoric. "My father would say how great things were in the Soviet Union, and how shabby they were here," Kravinsky recalled recently. "He would rail against rich people and the ruling class."
Kravinsky's father, Irving, who is now eighty-nine, was born in Russia to a Jewish family, which immigrated to America when he was a boy. A tank commander in the Second World War, he was a socialist whose faith in the Soviet Union was extinguished only after that country no longer existed. He worked as a printer, Kravinsky told me, "thinking he'd be in the vanguard of the revolution by remaining in the proletariat"; and when Zell, who had two older sisters, began to excel in school, his success seems to have been taken by his father as a sign of class disloyalty. After Zell graduated from elementary school with a prize as the best student, Irving told him, "Well, next year you'll be nothing."
James Kahn, a childhood friend of Kravinsky's, and a fellow-member of the chess team in high school, told me that Zell's father and mother--Reeda Kravinsky is a former teaching supervisor, now seventy-eight--"were steadfast in denying him any praise." He added, "I think what he did later was almost in desperation--doing the most extreme thing possible, something that they couldn't deny was a good thing." Reeda told me, "I think we did praise him, but maybe he didn't get enough attention, for an outstanding child."
As a boy, Kravinsky could hope to gain his parents' attention either by conforming or by rebelling; he did both. "Zell was simultaneously more left-wing and more right-wing than I was," Kahn said. He had an active social conscience--he read books on Gandhi, and, at the age of twelve, he picketed City Hall in support of public housing. (He remembers this as the last time he did anything that met with his father's approval.) But, by the standards of the late sixties, Kravinsky was unfashionably curious about money. He first invested in the stock market when he was twelve, and told me that he was "pretty young when I understood money better than my father did."
In Kravinsky's eyes, his father had humiliated himself in his relationship with money. Citing his radical politics, Irving Kravinsky said he couldn't apply for union work. "He was terrifically exploited," Kravinsky recalled. "He was afraid to ask for a raise. My mother yelled at him, day and night, said he wasn't a man, and 'Zell's more of a man than you are.' "
In 1971, Kravinsky won a scholarship to Dartmouth. He majored in Asian studies, wrote poetry, took up meditation, and grew his hair long. Soon after graduation, Kravinsky returned to Philadelphia, where he got a job at an insurance company. He began a relationship with a co-worker there, and moved in with her; the match lasted less than a year, but it had the side effect of introducing Kravinsky to real estate. He bought a duplex in the working-class neighborhood of Logan for ten thousand dollars, and rented out half of it. When the couple split up, Kravinsky kept the apartment, then sold it for a two-thousand-dollar profit.
As Kravinsky acquired a taste for property, he looked for ways to satisfy his idealistic self--in which good intentions were mixed with habits of self-criticism and a preemptive resentment about being ridiculed or undervalued. In 1978, he began to work with socially and emotionally troubled students in Philadelphia's public schools. "I became a teacher in the ghetto," Kravinsky recalls. "Everyone I went to college with laughed at that. I was written off as a failure."
The job offered moral satisfaction, but it also depressed Kravinsky--his pride in self-sacrifice counterbalanced by the thought that he was being taken for a ride. (Once, after school, he took a promising student to the theatre and, as he walked the boy home, he was mugged in a way that made Kravinsky think he might have been set up.) He grew more involved in real estate: he bought a condo, then a house in Maine; his deals became grander, and he began to see profits of tens of thousands of dollars. "Nobody in my family had ever made that much money," he said. He spent very sparingly, preferring to reinvest; by 1982, he owned a three-story building near the University of Pennsylvania campus, but he lived in the smallest, gloomiest apartment, with no shower, kitchen, or windows.
Barry Katz, who met Kravinsky around this time, and who is now a developer of luxury homes in Connecticut, found him to be brilliant and articulate, "with the kind of intensity you don't encounter in many people. He also had a lost-puppy quality." Kravinsky had skipped a year in high school and one in college, and, according to Edward Miller, another old friend, who is now a lecturer in English literature, his intellectual and emotional maturity seemed out of step. "You could call it high-school-geek syndrome," Miller said.
In 1984, Kravinsky was devastated by the death of Adria, the elder of his two sisters, from lung cancer. She was thirty-three; Zell was thirty. "She was the only person in my family who liked me in any meaningful way," Kravinsky said, describing the guilt he still feels for not showing her enough affection, and for not persuading her to quit smoking. "We were close, but there were so many things that kept me from spending more time with her. I wish I could go back." Kravinsky entered a period of deep depression. He shared a house with Miller, who remembers that Kravinsky mostly stayed in his room, writing poetry on a typewriter. Kravinsky stopped teaching in 1986, and he gave two of his three properties to his surviving sister, Hilary, and sold the other.
It was a despairing time, but it jolted Kravinsky out of the life of the self-abnegating schoolteacher. He expanded his intellectual ambitions, completing a Ph.D. in composition theory at Penn's School of Education. (His unusual dissertation proposed a "table of rhetorical elements," which was inspired by the periodic table.) He also took courses at the New School, in New York, and at the School of Criticism and Theory, at Dartmouth; in 1990, he began a second Ph.D. at Penn, with a dissertation, "Paradise Glossed," that dissected the rhetoric of Milton with mathematical rigor. At Penn, he started teaching undergraduate courses in Renaissance literature, and met and married Emily Finkelstein, a doctor who is now a psychiatrist with an expertise in eating disorders. Kravinsky became a resident adviser, and the couple lived frugally in student housing. ("Free rent, free meals--the greatest deal in the world," Kravinsky recalled.) They had the first of four children in 1991.
Kravinsky's Milton dissertation was "an intense close reading and quite wonderful," according to Maureen Quilligan, then the graduate chairperson of Penn's English department and now a professor at Duke. "It's one of the best I've ever read. It sounded like deconstruction, although he'd got there without having to do any deconstruction theory." After it was finished, Kravinsky taught an undergraduate Milton course at Penn that Quilligan describes as "fantastically successful--the kids responded to it with the wildest enthusiasm, and they worked hard for him and had a sublime intellectual experience." At the end of each lecture, Kravinsky would stand at the door and shake hands with every student. "He said he was hunting for another Milton," Quilligan remembers.
Though he was admired by students--and had impeccable leftist credentials--he was galled to find that his intellectual interests were considered insufficiently avant-garde by academe. As Kravinsky saw it, "What they didn't like was that Milton was the great classical liberal. Classical liberalism, bourgeois liberalism--they felt the same way about it as my father." Quilligan says that he was handicapped by "the eccentricity of his intellectual and spiritual intensity, added to the fact that he had written about a single white male author." Kravinsky recalls going to job interviews carrying letters of recommendation from scholars as distinguished as Stanley Fish, "and at every one they said, 'You have a spectacular portfolio, both of your Ph.D.s are relevant, Fish said you "can do anything"--but we're looking for diversity.' " Only the University of Helsinki offered him a job.
By 1994, he had decided to give up on an academic career. Instead, he would make a living in real estate. Kravinsky said that his wife was skeptical. "She said I'd become a bum," he told me. But, thanks to his earlier real-estate record, and his evident mathematical brilliance, Kravinsky was able to persuade the United Valley Bank to lend him two million dollars, with which he bought two apartment buildings--around a hundred and fifty thousand square feet in total--one near Penn, the other near St. Joseph's University. Kravinsky knew that in a recession people will go back to school, and that the ratio of rent to property prices will be highest where a university is in a run-down urban area. He was also fearless about being highly leveraged.
Kravinsky was improvising--"Nobody ever taught me how to succeed, or took me under their wing"--but his portfolio quickly grew, and within a year he had assets of six million dollars and debts of four million. Though he was now wealthy, he spent no more than he had before, with the exception of a hundred-and-thirty-thousand-dollar house that he bought in Jenkintown, a Philadelphia suburb, in 1995. (His second child had just been born.) "There was little of the mogul apparent to the eye," Barry Katz said. Even to his close associates, Kravinsky's business seemed implausible. Edward Miller took a job with him as an apartment manager but was never convinced that the property empire was real. "I didn't fully believe it," he told me. "I thought that somehow it was a deck of cards." These mistaken thoughts were reinforced by seeing "the most disorganized, chaotic organization you can imagine--leases at the bottom of closets, under the toilets, soaking wet." Miller was also surprised to see how blithely neglectful Kravinsky could sometimes be of contractors and janitors, as if he were grateful for the chance to take a vacation from the patient, solicitous persona he showed to his friends.
Property management ultimately did not suit Kravinsky--" 'Tenants and toilets'; there's a phrase that suggests the agony," he said--and in 1998 he began selling most of his rental properties (now about four hundred apartments) and turning to commercial real estate, investing at a level where the building is a mere premise for an intricate dance of numbers. "Everything else can change, but numbers remain the same; numbers are your best friends," Kravinsky said. "I needed to leverage my intellect, return to math."
Kravinsky bought supermarkets and warehouses; that is, he looked for tenants with good credit ratings and with long leases, then paid for the buildings with loans bundled into bonds by Wall Street banks and sold to institutional investors. These loans have a singular advantage: if things go wrong, nobody comes for your stereo. In 1999, in a typical deal, Kravinsky bought a clothing-distribution center in Ohio for $16.8 million. He put up $1.1 million and borrowed $15.7 million. If the building decreased in value by a hundred per cent, he would lose $1.1 million; if it increased in value by a hundred per cent, he would make $16.8 million.
"Most people think the more you borrow the riskier it is," Kravinsky has said. "In my system, the more you borrow the safer it is." (On a single day in April, 1999, he borrowed thirty-two million dollars. He remembers Emily asking, "How much do we have to pay on that?" It was around ten thousand dollars a day. She said, dryly, "Well, if worst comes to worst, I can just treat a hundred people a day.") Kravinsky made full use of the tax advantages of commercial-real-estate investments: in the eyes of the I.R.S., a shopping mall depreciates in value, like an office chair, and one can set that depreciation against income tax, overlooking the fact that a mall, over time, is likely to increase in value.
Kravinsky knew how to make money, but he had no talent for spending it. His investments were an expression of his intellect--they were splendid rhetorical gestures, and to take money out for, say, a swimming pool would be to lose the debate. Even as he became rich, he was arguing at home against buying two minivans to replace a 1985 Toyota Camry. (He eventually gave in, and lost the Camry, which has since become an object of regret and longing.) The children did not get pocket money, and Emily had to fight to have the front porch repaired. ("Emily was certainly complicit in the family's frugality, but she became frustrated by Zell's refusal to spend money," a friend of the Kravinskys' told me.) Kravinsky worked from home. He recalled how one well-dressed man came to interview for an accountant's job and, seeing Kravinsky's modest home and casual dress, ran away. Kravinsky watched him disappear down the street and called out, "Where are you going?" The interviewee shouted, "I don't believe you," and kept running.
About three years ago, as Kravinsky's assets rose to nearly forty-five million dollars--a million square feet of commercial real estate, along with lofts, houses, and condos--friends began to hear him talk of giving all his assets to charity. He had long entertained philanthropic thoughts, although, as Katz told me, "I don't think it ever occurred to Zell that the by-product of what he was doing would be wealth on this scale." In 1998, Kravinsky had tried to donate some properties and empty lots to the University of Pennsylvania. He says that the university was wary of him, and "didn't even take me out to lunch." As his portfolio grew, however, Kravinsky's charitable impulse became more urgent. Edward Miller remembers sitting at his dining table one night with Kravinsky and James Kahn, "and Zell began to talk of giving away his wealth. And we said, 'Don't do it.' " Kahn asked him why he didn't give away a third of his fortune, and use the rest to become richer, and ultimately give even more money away. As Miller recalled, "We berated him for three or four hours. We said, 'You're depressed.' He seemed like King Lear, dividing his kingdom so he could 'unburdened crawl toward death.' "
For the moment, Kravinsky's friends prevailed. "I think he wanted to be talked out of it," Miller said. But Kravinsky, the skilled rhetorician, seems to have discovered something unanswerable in his own rhetoric. "The reasons for giving a little are the reasons for giving a lot, and the reasons for giving a lot are the reasons for giving more," he recently said. Kravinsky feared that he might lose his assets, or his impulse to give, or that his wife would challenge the idea. Emily was philanthropically inclined, but, as Kravinsky recalled it, he needed to "walk her into the idea" of total divestment--gift by gift, keeping the emphasis on public health, which attracted her, and promising that quitting real estate would bring him closer to the family. "I said I'd have more time for the kids," he told me. "She thought it was crazy to give everything away, but she said, 'At least we'll be out of the business.' " The gifts were made with her blessing and in her name. "My impression was that she decided she didn't want to be made out to be a Scrooge," a friend of the Kravinskys' told me.
In 2002, Zell and Emily gave an eighty-seven-thousand-square-foot apartment building to a school for the disabled in Philadelphia. The same year, they gave two gifts, worth $6.2 million, to the Centers for Disease Control Foundation. The gifts were partly in the form of a distribution center, four condominiums, three houses, and a parking lot; Kravinsky placed them in a fund named for his late sister, Adria. In March, 2003, the Kravinskys created the Adria Kravinsky Foundation, to support a School of Public Health at Ohio State University; the gift included three warehouses, four department stores, and a shopping center in Indianapolis. Together, these were worth around thirty million dollars. Karen Holbrook, the president of O.S.U., called the gift "a magnificent commitment."
Kravinsky had put some money aside-he had established trust funds for his wife, his children, and the children of his surviving sister. But his personal assets were now reduced to a house (on which he had a large mortgage), two minivans, and about eighty thousand dollars in stocks and cash. According to Katz, "He gave away the money because he had it and there were people who needed it. But it changed his way of looking at himself. He decided the purpose of his life was to give away things."
Jenkintown, Pennsylvania, is a mixed-income community of about four thousand people which tries to maintain a small-town character within the sprawl of housing developments and shopping malls just north of Philadelphia. I made my first visit to Kravinsky in November, parking in front of a wooden-shingled house with a broken photocopier on the front porch and a tangle of bicycles, tricycles, and wagons. A handwritten sign by the door, a marker of spousal frustration, read, "Put Your Keys Away Before You Forget."
Kravinsky came to the door several minutes after I rang the bell. He is slight, and looked both boyish and wan, with pale, almost translucent skin. He wore sneakers, a blue plaid shirt, and tan trousers with an elasticized waist. He seemed distracted, and I realized later that the timing of my visit was awkward: he knew that his wife would not want a reporter in the house, but she had gone out, and two of his four young children were home, so he could not immediately go out to lunch with me.
He invited me into a house crowded with stuff, including a treadmill in the middle of the living room. He cleared away enough books and toys for me to sit down on a sofa. His daughter, who is nine, came into the room to say hello, but when Emily Kravinsky came home, a moment later, she walked straight past us into the kitchen, taking the girl with her. Kravinsky followed. He came back after a few minutes and picked up his coat, and as we left the house he said, "She wants us out of here."
We drove to a restaurant in a nearby mini-mall. He ordered a mushroom sandwich and a cup of warm water that he didn't touch. "I used to feel that I had to be good, truly good in my heart and spirit, in order to do good," he said, in a soft voice. "But it's the other way around: if you do good, you _become_ better. With each thing I've given away, I've been _more_ certain of the need to give more away. And at the end of it maybe I will be good. But what are they going to say-that I'm depressed? I am, but this isn't suicidal. I'm depressed because I haven't done enough."
Within a few minutes, Kravinsky had talked of Aristotle, Nietzsche, and the Talmud, and, in less approving terms, of the actor Billy Crudup, who had just left his pregnant girlfriend for another woman. ("How do you like that!") Kravinsky's mostly elevated range of reference, along with a rhetorical formality and a confessional tone, sometimes gave the impression that he was reading from his collected letters. "What I aspire to is ethical ecstasy," he said. "_Ex stasis:_ standing out of myself, where I'd lose my punishing ego. It's tremendously burdensome to me." Once achieved, "the significant locus would be in the sphere of others."
His cell phone rang, and a mental switch was flicked: "You have to do a ten-thirty-one and put fresh money in on terms that are just as leveraged . . . going eight per cent over debt. . . . I think we should do it. It's nice to start with a blue chip."
These contrasting discourses have one clear point of contact. In our conversations, Kravinsky showed an almost rhapsodic appreciation of ratios. In short, ratios are dependable and life is not. "No number is significant in itself: its only significance is in relation to other numbers," he said. "I try to rely on relationships between numbers, because those relationships are constant-unlike Billy Crudup and the woman he impregnated. Even if the other relationships in our lives are going to hell in a handbasket, numbers continue to cooperate with one another."
In the months following the first of Kravinsky's financial gifts, a new ratio began to preoccupy him: the one-in-four-thousand chance that a person has of dying in an operation to donate a kidney. In early 2003, he read an article in the _Wall Street Journal_ that introduced him to the idea of nondirected kidney donations, in which an altruistic-minded person gives an organ to benefit a stranger-someone in the pool of sixty thousand people on America's kidney-transplant waiting list. The demand for kidneys outstrips the supply; the buying and selling of organs is illegal, and although there are between fifteen and twenty thousand deaths in America each year that could yield organs, about half of families deny permission for the bodies of their relatives to be used in this way, often disregarding the dead person's donor card. Kravinsky was so struck by the article that he cut it out and kept it in a desk drawer.
The notion of nondirected organ donation is not new. Joseph E. Murray, who directed the first successful human kidney-transplant operation, in 1954, in Boston, recently recalled that, by that time, he had received three offers of kidneys-from a prisoner, a homeless man, and a nun. They could not be accepted; early transplants were generally between identical twins, for a precise biological match. But in the early sixties advances in immunosuppressant drugs allowed surgeons to begin transplanting from deceased donors to unrelated recipients and from living donors other than twins-typically, blood relatives. By 1963, there were no medical barriers to nondirected donation. But while kidney transplants became almost routine-last year, there were sixty-five hundred living-donor and eighty-seven hundred deceased-donor transplants in America-nondirected donation did not.
On occasion, altruists engaged in a somewhat less radical practice, donating kidneys to people they had not met but whose plights had attracted their attention (say, through a newspaper article). But doctors were resistant even to this idea, and questioned the sanity of these donors; according to a paper published in _Seminars in Psychiatry_ in 1971, the practice was viewed by most physicians as "impulsive, suspect, and repugnant." Doctors were also under the impression, now revised, that related donations were almost always better than unrelated ones. In addition, a kidney-removal operation was initially far more painful and invasive than it later became; until the mid-nineties, it was often necessary to break the donor's rib, and the donor was frequently left with a long scar.
In the late nineties, by coincidence, two donors independently approached two hospitals with a request to make a nondirected kidney donation, and neither hospital could think of a good reason for turning them away. Joyce Roush, a transplant nurse from Indiana, introduced herself to Lloyd Ratner, a leading transplant surgeon then at Johns Hopkins, in Baltimore. "I was quite skeptical," Ratner told me. "I said, 'Give me a call and we'll consider,' thinking she'd never call me. She called and called." And at the University of Minnesota a would-be donor with a long record of altruistic acts made the same request, saying, "I want to do this and go home and be happy." Following bioethical consultation, and psychiatric testing of the donors, the hospitals accepted the offers: in Minnesota, the anonymous donor's kidney was transplanted in August, 1999; a few weeks later, at Johns Hopkins, Joyce Roush donated one to a thirteen-year-old boy. Now, several dozen nondirected donations are performed each year in the U.S.
Kravinsky considered the risks. Although Richard Herrick, who received the first kidney transplant, died eight years later, Ronald Herrick, his donor and twin brother, is still alive. As Herrick's example suggests, and medical research confirms, there are no health disadvantages to living with one kidney. One is enough-it grows a little bigger-and the notion that a spare should be packed for emergencies is misconceived: nearly all kidney disease affects both.
The risks are in the operation. "I had a one-in-four-thousand chance of dying," Kravinsky told me. "But my recipient had a certain death facing her." To Kravinsky, this was straightforward: "I'd be valuing my life at four thousand times hers if I let consideration of mortality sway me."
He made one other calculation: there was a chance that one of his four children-then aged between three and eleven-might need a kidney that only he could supply. Kravinsky took into account the rarity of childhood kidney disease, the fact that he had only ten or so years left as a viable donor, and the fact that siblings tend to be the best kidney matches-his children were well provided with siblings. He decided that the risk was no greater than one in two hundred and fifty thousand, and that it was a risk he could accept. In fact, Kravinsky began to think of a donation as "a treat to myself. I really thought of it as something pleasurable."
In a now famous 1972 essay, "Famine, Affluence, and Morality," the Australian philosopher Peter Singer set up the ethical puzzle that has become known as the Shallow Pond and the Envelope. In the first case, a child has fallen into a shallow pond and is drowning; Singer considers saving the child, and reflects on the inconvenience of muddy clothes. In the second, he is asked by the Bengal Relief Fund to send a donation to save the lives of children overseas.
To ignore the child in the pond would be despicable, most people would agree; to ignore an envelope from a charity would not be. (And the law supports that view.) But Singer's contention was that the two derelictions are ethically alike. "If we can prevent something bad without sacrificing anything of comparable significance, we ought to do it," he has written. To allow harm is to do harm; it is wrong not to give money that you do not need for the most basic necessities.
Many philosophers disagree-and would argue, in one way or another, that we can have greater faith in our intuitive moral judgments. Colin McGinn, a philosopher at Rutgers, has called Singer's principle "positively bad, morally speaking," for "it encourages a way of life in which many important values are sacrificed to generalized altruism" and devalues "spending one's energies on things other than helping suffering people in distant lands. . . . Just think of how much the human race would have lost if Newton and Darwin and Leonardo and Socrates had spent their time on charitable acts!" Singer has his adherents: in 1996, Peter Unger, a philosopher at New York University, published "Living High and Letting Die," an extension of Singer's analysis whose aim was to show how we let ourselves off the ethical hook too easily. According to Unger, we placate our consciences with an "illusion of innocence."
By the spring of 2003, Zell Kravinsky had become a man with no such illusion. "It seems to me crystal clear that I should be giving all my money away and donating all of my time and energy," Kravinsky said, and he speculated that failure to be this generous was corrosive, in a way that most people don't recognize. "Maybe that's why we're fatigued all the time," he mused-from "the effort" of disregarding the greater need of others. "Maybe that's why we break down and suffer depressions: we have a sense that there's something we should be remembering and we're not. Maybe that's what we should be remembering-that other people are suffering."
He discussed the idea of kidney donation with his family and friends. "I thought, at first, that people would understand," Kravinsky told me. "But they don't understand math. That's an American pastime-grossly misunderstanding math. I've had to repeat it over and over. Some people eventually got it. But many people felt the way my wife did: she said, 'No matter how infinitesimal the risk to your family, we're your family, and the recipient doesn't count.' "
Arguments about philanthropic extremes tend to be arguments about families. In "Bleak House," Dickens says of his character Mrs. Jellyby that she "could see nothing nearer than Africa": in a home full of trash, she is so busy helping the unfortunate abroad that she disregards her children, who are filthy and covered with bruises-the "notched memoranda of their accidents." As Esther Summerson, the novel's moral center, says of Mrs. Jellyby, "It is right to begin with the obligations of home. . . . While those are overlooked and neglected, no other duties can possibly be substituted for them." This is a reasonable case for philanthropic restraint, but it's also an excuse for philanthropic inaction: the narrator of Nick Hornby's novel "How to Be Good" wrestles with this argument, guiltily, after her husband makes a sudden conversion to virtue-giving away money and goods, and offering their spare room to a homeless teen-ager. "I'm a liberal's worst nightmare," her husband says, in response to the narrator's Esther-like fears for her children's comfort. "I think everything you think. But I'm going to walk it like I talk it."
Chuck Collins, a great-grandson of Oscar Mayer, is a rare nonfictional example of someone who gave away all his assets during his lifetime-a half-million-dollar inheritance, which he donated to charity nearly twenty years ago. He became used to hearing pleas in behalf of his (then only potential) offspring. "People would say, 'That's fine, you can be reckless in your own life, but you shouldn't do that to your children,' " Collins told me. "But I think parents make decisions for their kids all the time-that's what parenting is." He now has a daughter, who does not live like a Jellyby. "Of course, we have to respond to our immediate family, but, once they're O.K., we need to expand the circle. A larger sense of family is a radical idea, but we get into trouble as a society when we don't see that we're in the same boat."
Kravinsky's conversations with his family, and about his family, left him feeling like an alien. "The sacrosanct commitment to the family is the rationalization for all manner of greed and selfishness," he said. "Nobody says, 'I'm working for the tobacco company because I like the money.' They say, 'Well, you know, I hate to do it, but I'm saving up for the kids.' Everything is excused that way. To me, it's obscene."
During one of our conversations, I asked Kravinsky to calculate a ratio between his love for his children and his love for unknown children. Many people would refuse to engage in this kind of thought experiment, but Kravinsky paused for only a moment. "I don't know where I'd set it, but I would not let many children die so my kids could live," he said. "I don't think that two kids should die so that one of my kids has comfort, and I don't know that two children should die so that one of my kids lives."
Judith Jarvis Thomson, a philosopher at M.I.T. and the author of "The Realm of Rights," later told me, "His children are presumably no more valuable to the universe than anybody else's children are, but the universe doesn't really care about _any_ children-yours or mine or anybody else's. A father who says, 'I'm no more concerned about my children's lives than about anybody else's life' is just flatly a defective parent; he's deficient in views that parents ought to have, whether it maximizes utility or not."
Someone who knows both Kravinskys well told me, "If your spouse is doing something to himself, he is, to a certain extent, doing it to you also. Zell would be an exasperating person to be married to." Susan Katz, the wife of Kravinsky's friend Barry Katz, told me, "I thought he was crazy. I thought it was just weird. If you're a father, you can't put your life at risk." Kravinsky said that his wife's initial attitude echoed these sentiments-she was "adamantly opposed," on the ground of familial responsibility. She eventually grew more accepting of the idea, at least in the abstract. During a recent telephone conversation in which her anger about Zell's actions was made clear, Emily disputed this description, saying that her opposition was constant, and derived from her opinion that Zell, who has digestive difficulties, was unsuited to an operation of this kind. "I have no objection to nondirected organ donations," she said. "I think they're a very good thing, if the donor is medically appropriate for elective surgery, and if the donation is carried out in a medical center that's prepared to provide good care."
The rest was math and poetry: Kravinsky has said that he was driven by "the mathematical calculus of utilitarianism," which gives primacy to the idea of the "greatest good." But he acknowledges, too, another impulse, which emanated from what he calls his romantic or neurotic self: to give a kidney was a self-sacrificing, self-dramatizing act. The utilitarian in Kravinsky might give up his coat to a stranger, if to have no coat would not disable him as a champion of the coatless; but the romantic in Kravinsky would give the coat unquestioningly, loudly renounce coat-wearing worldwide, and then give away his pants.
In April, 2003, Kravinsky called the Albert Einstein Medical Center, an inner-city hospital where he could be fairly confident that a donated kidney would go to a low-income African-American patient. Kravinsky told me that the transplant coordinator who spoke to him was "pretty leery of the whole thing, and kept telling me there was no payment." The hospital had never operated on a nondirected donor. But he went there to meet a surgeon, who believed Kravinsky's reports of two Ph.D.s and his philanthropy only after doing a Google search, and then a psychiatrist, who told him, "You're doing something you don't have to do." Kravinsky replied, "I _do_ have to do it. You're missing the whole point. It's as much a necessity as food, water, and air."
Kravinsky acknowledged that he suffered from depression and that he did not have his wife's approval for the donation. He allowed the hospital to speak to his own psychiatrist, but said that he would not be able to bring Emily in for joint consultations. The hospital accepted this, after officials learned that family support of nondirected donors is often hesitant, at best. "The consensus was, if this is what he wants to do and he's a competent individual, you can't deny him because someone doesn't want him to do it," Radi Zaki, the director of the Center for Renal Disease at Albert Einstein, said. "But we made the process hard for him. We delayed, we put him off. The more impatient he got, the more delay I gave him. You want to make sure this is the real deal."
In June, Kravinsky was accepted for the operation. Donnell Reid, a twenty-nine-year-old single black woman studying for a degree in social work, whose hypertension had forced her to undergo dialysis for eight years, was informed that she was the possible recipient of a kidney from a nondirected donor. "It was so surreal," she recently told me. "You're going about your life, and then you get this phone call." She went in for tests, then waited. "I prayed. I left it in God's hands." She told none of her friends: "It was such an overwhelming thing, such an awesome thing, I wanted to meditate on it on my own." A week later, on July 7th, she learned that she had been selected for the operation, and the next day, at Kravinsky's request, they met at the transplant center. They talked for two hours. She described her plans for the future, and thanked him for a generosity "beyond words."
On July 22nd, Kravinsky left home early-"I snuck out"-and drove to the hospital, where Zaki asked him again if he would like to reconsider his decision. "He was very calm," Zaki recalls. Kravinsky had not told his wife the details of his plan, but he had approached a reporter at the Philadelphia _Daily News,_ a local tabloid, which ran a story that morning. In a three-hour operation that started at 8 a.m.-a laparoscopic removal, requiring minimal incisions-he gave up his right kidney to Reid, who was in the room next door.
The next morning, Kravinsky called his wife from his hospital bed. Because of his digestive complications, he had to be taken off opiate-based painkillers, and he says that he took nothing in their place. (Zaki, affectionately describing Kravinsky as a "dramatic" patient, disputed this memory of total abstinence.) Zell asked Emily for help: "She was furious. She didn't want me to die, but, on the other hand, she was beyond human rage." She said that she was willing to talk to the doctors about his treatment. She also threatened to divorce him.
At that moment, Kravinsky recalled, "I really thought I might have shot it with my family." His parents were also appalled. When Reeda Kravinsky visited her son in the hospital, she recalled, "I was so filled with anger that I didn't speak." Meanwhile, Kravinsky's mind was still turning on philanthropic questions. "I lay there in the hospital, and I thought about all my other organs. When I do something good, I feel that I can do more; I burn to do more. It's a heady feeling." He went home after four days, and by then he was wondering if he should give away his other kidney.
A few weeks ago, in a Barnes &amp; Noble bookstore in Jenkintown, Kravinsky pulled out his shirt a couple of inches and showed me a tidy scar, no more than six inches long, on his right hip. "Once in a while, I remember I only have one kidney," he said, smiling-apparently struck anew by the thought that the donation had been a surgical act as well as a symbolic one. "It feels a little weird-'Oh, yeah, I only have one!'-but the other body parts are very happy, they have breathing room." It was an unusually upbeat thought, connected to a moment of moral clarity. "It was a good deed," he said. "However I screw up morally in the future, this is something nobody can take away."
Kravinsky's mood had improved since our first meeting, four months after the operation, when he had seemed vulnerable. He was always engaging, eccentric company-during lunch at a restaurant, he opened twenty packets of sugar and poured the contents into his mouth; he gave the impression that he would rather wait forever in a stationary elevator than be the one to press the button-but he was dispirited. He often spoke in fateful terms about his marriage, which had held together but was under constant stress. He worried about his relationship with his children-he showed me touching poems he had written about them-and their relationship with the world. (In the schoolyard, a child had approached one of his sons, saying, "Why don't you just donate me that cheese stick?") He said he had lost his sense of direction. "I feel unmoored," he told me.
Having redefined his life as a continuing donation, but having given away everything that came immediately to hand, Kravinsky was not sure how to proceed. His utilitarian and romantic selves were now in competition, and he did not trust his ability to distinguish between the two, or to distinguish between them and vanity. He saw a baffling choice between engagement and disengagement, between creating wealth and withdrawing into a life of poverty. When Kravinsky's thoughts migrated to rhetorical extremes, the choice seemed to be between life and death.
Several times, Kravinsky talked of giving away his other kidney and living on dialysis, and then he would upbraid himself for hesitating. "If I didn't have kids, and I saw a child who was dying for want of a kidney, I would offer mine," he said. He sometimes imagined a full-body donation. "My organs could save several people if I gave my whole body away," he told me. "But I don't think I can do that to my family. Or, at least, I can't endure the humiliation. I've thought about it: my kids would be under a cloud, everybody would pillory me as a showboat or a suicide. I know it's a thing I ought to do; other lives are equal to my own, and I could save at least three or four. I have fantasized about it. I've dreamed about it. But I don't have the nerve." He said that "before it happened I'd have to endure the screams and yells from my family. Then I would be committed." He laughed. "My wife and my sister are psychiatrists."
Kravinsky could see one clear role for himself: as a promoter of a free market in kidneys, an idea with limited but growing intellectual support. Richard A. Epstein, a libertarian law professor at the University of Chicago, championed this argument in his 1999 book, "Mortal Peril: Our Inalienable Right to Health Care?," urging a "frontal assault on the present legal regime" and its "moral philosophy of false comradeship." He wrote, "No one disputes the Beatles' proposition that 'money can't buy you love,' but the proposition does not require any form of _ban_ on the payment of cash in certain human relations." Epstein recently told me, "When I talk about this now, nobody treats me as a complete kook. People are a little more respectful." In Kravinsky's opinion, an efficient market would quickly set a price for a kidney at ten thousand dollars or so. "College kids would do it. A college kid goes to a party, there's a greater risk of dying from drugs or alcohol or a car crash than one in four thousand." He said that any anxiety about exploitation was misplaced: "If the risk is lower than the other ways to make the money, where's the exploitation? How dare people be so condescending."
A few weeks after this discussion, Kravinsky called me. He had just been approached by a local woman in her forties who had spent years on dialysis, and who was running out of places on the body where a dialysis needle can enter a vein. She wanted to buy a kidney. Not long before, two young women had jointly written to Kravinsky; both were interested in selling a kidney. He told me that he had arranged to bring the women together at a cafe near his house. He would be an unpaid broker in a kidney sale. "I'll take the heat, which will probably mean getting arrested," he said. (The 1984 National Organ Transplantation Act prohibits the sale of kidneys.) "I feel very nervous, but I feel the decision's been made-because I'm not going to let that woman die, and who else in America would do this? I'm the only person who can save her life by setting this up. I'm not going to do anything that stands in the way of saving a life, whether it's my money, my reputation. It's a very big step, but there's no choice. The choice is, I say no and the rest of my life I know that someone died."
He called me when he got home, a few hours later. "Oh, brother, she's in bad shape," he said of the would-be recipient. He said that "everyone had liked each other" at the meeting, and an agreement had been reached. The recipient would take a kidney from whichever of the two women was a better match: both would present themselves to a hospital as friends offering a donation. The sick woman had agreed to pay fifty thousand dollars for the organ.
Kravinsky was energized-he foresaw a test case, a shift in public opinion. He was ready to embrace infamy. But when we spoke again he was worried about legal consequences. "Can you imagine _me_ in prison for five years?" he asked.
Later, when I brought up the subject once more, he said that a lawyer had told him to "just leave it alone." He was taking every opportunity to promote kidney donation, but he had given up the role of broker. Today, the three women remain in touch, but they have not yet closed a deal.
According to Kravinsky, his family was living on about sixty thousand dollars a year, from Emily's part-time medical practice and from interest derived from Zell's remaining capital. The children were in public schools; the minivans were paid for. "The real test of my vanity would be if I gave everything away," Kravinsky said. "Not just to the point of a working-class existence but to the point of poverty."
Yet even while Kravinsky aspired to a life spent "passing out pamphlets on the subway," as he put it, it pained him to think of giving up the language of finance, which he spoke so well. "To really achieve wealth, you have to have a love of money-you have to enjoy the play of numbers behind your eyelids," he said.
Indeed, near the end of last year, Kravinsky had begun talking to a local venture capitalist; together they planned a real-estate partnership that would invest on behalf of others in the kind of commercial property that Kravinsky had experience buying and selling. He would give his half of the shares to charity. Other charities could invest without paying fees. Kravinsky initially talked of this as a single stratum in a layered life of agitation, donation, and sacrifice, but this spring, as he began to talk to real-estate agents, the partnership began to emerge as a new full-time job. His mood lightened, and he seemed giddy whenever I overheard him using the jargon of amortization, appraisals, and conduit financing. "I do feel a kind of bonhomie-it's strange-in business," he admitted.
Not long ago, Kravinsky toured a Cingular wireless-call center in eastern Kentucky, a building being offered at thirteen million dollars. His guide, the office manager, was a young tanned woman who wore pin-striped trousers. Kravinsky, bouncy and a little flirtatious, looked like a graduate student in geology, and, as he walked among the thousand desks laid out in honeycomb arrangements under signs reading "I Am Proud to Be Part of the Perfectly Awesome Crew," everyone looked up. "It's just a glue-down carpet?" he asked the office manager. "Are these load-bearing walls? Is that eight inches of concrete?" At the end of the tour, he said, with feeling, "This is a beautiful center, I have to say."
He was no longer adrift, yet he had not discovered ethical ecstasy, either. Peter Singer has called him "a remarkable person who has taken very seriously questions of what are our moral obligations to assist people." He says, "I think it's very difficult for people to go as far as he has, and I don't think we should blame people who don't, but we should admire those who do."
Kravinsky himself held on to self-doubt. He did buy the Kentucky call center; soon afterward, he spent the night at the sunny, high-ceilinged home of Barry and Susan Katz, in Westport, Connecticut. He got up late, and, long after his friends had finished breakfast, he sat eating cereal at the head of a polished black table. He was unrested, and was troubled by the thought that a renewed career in real estate might block his path to virtue.
"But don't you think giving away forty-five million dollars was a good first step?" Barry Katz asked him, taking up the challenge of having moral absolutism as a weekend house guest.
"No," Kravinsky replied. "That's not the hard part. The hard part is the last ten thousand dollars a year-when you have to live so cheaply you can't function in the business world." He added, "If I need a coat to visit an investment banker's office because I'll look bizarre if I don't have one, but then I see somebody shiver, I should give my coat to him."
"But what if you made enough money, after meeting with the investment banker, to fund research into AIDS prevention, something extremely good for the world?" Katz asked. "You're not going to get very far in an investment banker's office wearing sackcloth."
"I think suits are despicable. Suits and ties. I think I should go into the office naked." Kravinsky was smiling. "If I went into the office of a banker naked, I'd be . . ."
"You'd be arrested," Katz said.
Katz remembered the time he had hung up on Kravinsky a few weeks before the kidney operation. "He almost broke off with you," Susan Katz told Kravinsky.
"Oh, Barry," Kravinsky said. "It isn't that I think people are evil. But it's a fact that our actions, in some sense our thoughts, let some people live and some people die."
Susan, sitting at the other end of the table, looked at Kravinsky with fond exasperation and asked, "This is how you think every day, really? That's got to be tough. It seems so sad. You seem so sad."
"Well, I am sad." Kravinsky had arranged everything within arm's reach (orange juice, mug, salt, sugar, cereal box) into a tight cluster on his placemat. His adventure in donation had been a rhetorical opportunity, a showcase for his underappreciated talent for argument. But for a moment the debate had slowed, and Kravinsky spoke less forcefully, in apparent recognition of the unequal ratio of sacrifice to sustenance, of good done to moral certainty felt.
"But shouldn't there be more joy in this?" Barry said.
"I don't think of it as something that's joyful. Why should I feel joy?"
"I just feel that if you really were on this path to enlightenment, whatever it is, you would feel joy."
"It's not enlightenment," Kravinsky said quietly. "It's the start of a moral life."]]></description></item><item><title>The most important questions and problems</title><link>https://stafforini.com/notes/the-most-important-questions-and-problems/</link><pubDate>Sun, 20 Aug 2017 00:00:00 +0000</pubDate><guid>https://stafforini.com/notes/the-most-important-questions-and-problems/</guid><description>&lt;![CDATA[What are the most important questions to answer? What are the most important problems to solve? Various people and organizations in the effective altruist community have over the years compiled lists of such questions and problems. This post provides links and brief descriptions of all the lists I'm currently aware of. (Note that many of these lists focus on specific causes, such as artificial intelligence, or on specific disciplines, such as moral philosophy.)
_Update: This post was originally written in 2017. Michael Aird has more recently compiled a very comprehensive [list of open research questions](https://forum.effectivealtruism.org/posts/MsNpJBzv5YhdfNHc9/a-central-directory-for-open-research-questions), which largely supersedes the present list (though Aird's directory excludes some entries included here)._
_Further update: 80,000 Hours has now released an impressive [list of research questions that could have a big social impact, organised by discipline](https://80000hours.org/articles/research-questions-by-discipline/)._
[80,000 Hours](https://80000hours.org/articles/cause-selection/) {#hours}
A ranking of the top 10 most pressing global problems, rated by scale, neglectedness, and tractability.
[80,000 Hours](https://forum.effectivealtruism.org/posts/xoxbDsKGvHpkGfw9R/) {#hours-1}
A more extensive list of problem areas outside those listed above.
[AI Impacts](https://aiimpacts.org/promising-research-projects/) {#ai-impacts}
A list of tractable and important AI-relevant projects. See also their [list of key questions of interest](https://aiimpacts.org/ai-impacts-key-questions-of-interest/) and their [list of possible empirical investigations](https://aiimpacts.org/possible-investigations/).
[Center for Reducing Suffering](https://centerforreducingsuffering.org/open-research-questions/) {#center-for-reducing-suffering}
A list of research directions relevant for reducing suffering.
[Center on Long-Term Risk](https://longtermrisk.org/open-research-questions/) {#center-on-long-term-risk}
A comprehensive ranking of open research questions, rated by importance.
[Future of Life Institute](http://futureoflife.org/data/documents/research_survey.pdf) {#future-of-life-institute}
A survey of research questions for robust and beneficial AI.
[Global Priorities Institute](https://globalprioritiesinstitute.org/wp-content/uploads/gpi-research-agenda.pdf) {#global-priorities-institute}
A detailed list of important problems.
[Open Philanthropy Project](http://www.openphilanthropy.org/blog/technical-and-philosophical-questions-might-affect-our-grantmaking) {#open-philanthropy-project}
A list of technical and philosophical questions that could influence OpenPhil's grantmaking strategy.
[Nick Beckstead](http://www.nickbeckstead.com/advice/ea-research-topics) {#nick-beckstead}
A list of valuable research questions, with a focus on the long term. See also Nick's presentation on '[Jobs I wish EAs would do](https://drive.google.com/file/d/0B8_48dde-9C3dUNSVUhQWkdtMDlsa3RweE03QnJXYy0za1JN/view)'.
[Wei Dai](https://www.alignmentforum.org/posts/rASeoR7iZ9Fokzh7L/problems-in-ai-alignment-that-philosophers-could-potentially) {#wei-dai}
A list of problems in AI Alignment that philosophers could potentially contribute to.
[Robin Hanson](https://www.overcomingbias.com/2020/11/join-the-universe.html) {#robin-hanson}
A list of 40 or so "big" questions. See also his [list of important and neglected problems](https://www.overcomingbias.com/2017/02/neglected-big-problems.html).
[Jamie Harris](https://www.sentienceinstitute.org/blog/prioritization-questions-for-artificial-sentience) {#jamie-harris}
A list of possible crucial considerations for artificial sentience.
[Holden Karnofsky](https://forum.effectivealtruism.org/posts/zGiD94SHwQ9MwPyfW/important-actionable-research-questions-for-the-most) {#holden-karnofsky}
A list of important, actionable research questions given that the present century could be the most pivotal in history.
[Will MacAskill](https://80000hours.org/2012/10/the-most-important-unsolved-problems-in-ethics/) {#will-macaskill}
A list of the most important unresolved problems in moral philosophy.
[Luke Muehlhauser](http://lukemuehlhauser.com/some-studies-which-could-improve-our-strategic-picture-of-superintelligence/) {#luke-muehlhauser}
A comprehensive list of potential studies that could, if carried out, illuminate our strategic situation with regard to superintelligence. See also [this early post](https://www.lesswrong.com/posts/i2XoqtYEykc4XWp9B/ai-risk-and-opportunity-a-strategic-analysis).
[Richard Ngo](https://forum.effectivealtruism.org/posts/2e9NDGiXt8PjjbTMC/technical-agi-safety-research-outside-ai) {#richard-ngo}
A list of questions whose answers would be useful for technical AGI safety research, but which will probably require expertise outside AI to answer.
[Jess Riedel](https://blog.jessriedel.com/2016/03/15/physicswell/) {#jess-riedel}
A list of topics in physics that should be funded on the margin right now by someone trying to maximize positive impact for society.
[Anders Sandberg](http://aleph.se/andart2/human-development/best-problems-to-work-on/) {#anders-sandberg}
A short list of the best problems to work on, intended as a supplement to 80,000 Hours' ranking. See also Anders' final answer in [this interview](https://intelligence.org/2014/03/02/anders-sandberg/), which mentions the research questions that he believes are most relevant to space colonization.]]></description></item></channel></rss>