DHQ: Digital Humanities Quarterly
Editorial
Test Article: Code Samples
Abstract
A test article for code syntax highlighting
\RepeatMsg{D. C. al Coda}
\RepeatMsg{Fine}
On the morning when 10,000 started out from Selma, the people sang about \quoted{that
great gettin' up
morning}. \Dots. They sang \wtitle{Oh, Freedom} and various folksongs, but again
and again they came back to
\wtitle{We Shall Overcome}, making up hundreds of verses to fit the simple
melody.\autocite[472]{Southern:BlackAmericans}
\usepackage{semantic-markup}
\sigla{D-Mbs}{Mus. ms. 1234}
\section{Introduction}
\allsectionsfont{\sffamily}
\NewDocumentCommand{\sigla}{ m m }{\textit{#1}: #2}
\include ~\lib\ly
print("hello world");
<note pname="c" oct="4" accid="s" dur="4" dots="1" />
31A23
31A25
31A26
31A27
31A23
31A27
31A25
31A26
Q4502142
Q15709879
int winter = 1;
do {
    Random r = new Random();
    string[] actions = new string[] { "walk", "hop", "speak" };
    actions = actions.OrderBy(x => r.Next()).ToArray();
    foreach (string doAction in actions) {
        makeWisakecahk(doAction);
    }
    winter--;
} while (winter == 1);
for (int x = 0; x < WORLD_W; x += crimeRateMap.MAP_BLOCKSIZE) {
    for (int y = 0; y < WORLD_H; y += crimeRateMap.MAP_BLOCKSIZE) {
        int z = landValueMap.worldGet(x, y);
        if (z > 0) {
            ++numz;
            z = 128 - z;
            z += populationDensityMap.worldGet(x, y);
            z = min(z, 300);
            z -= policeStationMap.worldGet(x, y);
            z = clamp(z, 0, 250);
            crimeRateMap.worldSet(x, y, (Byte)z);
            totz += z;
        } else {
            crimeRateMap.worldSet(x, y, 0);
        }
    }
}
if (landValueFlag) {
    /* LandValue Equation */
    dis = 34 - getCityCenterDistance(worldX, worldY) / 2;
    dis = dis << 2;
    dis += terrainDensityMap.get(x >> 1, y >> 1);
    dis -= pollutionDensityMap.get(x, y);
    if (crimeRateMap.get(x, y) > 190) {
        dis -= 20;
    }
}
if (!getRandom(DisChance[x])) {
    switch (getRandom(8)) {
        case 0:
        case 1:
            setFire();        // 2/9 chance a fire breaks out
            break;
        case 2:
        case 3:
            makeFlood();      // 2/9 chance for a flood
            break;
        case 4:
            // 1/9 chance nothing happens (was airplane crash,
            // which EA removed after 9/11, and requested it be
            // removed from this code)
            break;
        case 5:
            makeTornado();    // 1/9 chance tornado
            break;
        case 6:
            makeEarthquake(); // 1/9 chance earthquake
            break;
        case 7:
        case 8:
            // 2/9 chance a scary monster arrives in a dirty town
            if (pollutionAverage > /* 80 */ 60) {
                makeMonster();
            }
            break;
case 4:
    // 1/9 chance nothing happens (was airplane crash,
    // which EA removed after 9/11, and requested it be
    // removed from this code)
    break;
(game "Tic-Tac-Toe"
    (players 2)
    (equipment {
        (board (square 3))
        (piece "Disc" P1)
        (piece "Cross" P2)
    })
    (rules
        (play (move Add (to (sites Empty))))
        (end (if (is Line 3) (result Mover Win)))
    )
)
pip install silabeador fonemas stanza==1.7.0 libEscansion
import stanza

stanza.download(
    lang="es",
    package=None,
    processors={
        "ner": "ancora",
        "tokenize": "ancora",
        "pos": "ancora",
        "lemma": "ancora",
        "mwt": "ancora",
        "constituency": "combined_charlm",
        "depparse": "ancora",
        "sentiment": "tass2020",
    },
)
<p> Why was there no room in <em>six volumes</em> of Taruskin's <cite>Oxford History of Western Music</cite> for a single reference to Louis Armstrong? </p>
:c1 a skos:Concept ;
    skos:prefLabel "c1"@en ;
    skos:inScheme :sbjf_1 ;
    skos:definition "A battery-electric car that is capable of traveling at a maximum speed of 25 miles per hour (mph) and has a maximum loaded weight of 3,000 lbs."@en ;
    skos:definition "Véhicule à deux places, activé par un moteur électrique à courant continu alimenté par des batteries au plomb rechargeables à partir d'une prise de courant résidentielle de 110 volts."@fr .

:sbjf_1 a skos:ConceptScheme ;
    skos:prefLabel "e-mobility"@en .
:enLex a lime:Lexicon ;
    lime:language "en" ;
    lime:entry :t1, :t2 .

:frLex a lime:Lexicon ;
    lime:language "fr" ;
    lime:entry :t3 .

:itLex a lime:Lexicon ;
    lime:language "it" ;
    lime:entry :t4, :t5 .
:t1 rdf:type ontolex:LexicalEntry ;
    lexinfo:partOfSpeech lexinfo:noun ;
    lexinfo:normativeAuthorization lexinfo:preferredTerm ;
    lexinfo:termType lexinfo:fullForm ;
    ontolex:canonicalForm :t1_cf ;
    ontolex:sense :t1_sense .

:t1_cf rdf:type ontolex:Form ;
    ontolex:writtenRep "neighborhood electric vehicle"@en .

:t1_sense rdf:type ontolex:LexicalSense ;
    skos:definition [
        rdf:value "A battery-electric ..." ;
        dct:source "TechTarget" ;
        rdfs:seeAlso <https://www.techtarget.com/whatis/definition/neighborhood-electric-vehicle-NEV>
    ] ;
    ontolex:usage [
        rdf:value "A Neighborhood Electric Vehicle is a U.S. category for ..." ;
        dct:source "Wikipedia" ;
        rdfs:seeAlso <https://en.wikipedia.org/wiki/Neighborhood_Electric_Vehicle>
    ] .
intertextualRelation’(T, T)
intertextualRelation(i, T, T, …)
intertextualRelation(i, T, T, S+, ...)
CALL SHIFT(A(J+1),7*(I-6),YY)
      CALL SHIFT(A(J+1),7*(K-6),YY)

      SUBROUTINE SHIFT (VAL,DIST,RES)
      IMPLICIT INTEGER (A-Z)
      RES=VAL
      IF(DIST)10,20,30
   10 IDIST=-DIST
      DO 11 I=1,IDIST
      J = 0
      IF (RES.LT.0) J="200000000000
   11 RES = ((RES.AND."377777777777)/2) + J
   20 RETURN
   30 DO 31 I=1,DIST
      J = 0
      IF ((RES.AND."200000000000).NE.0) J="400000000000
   31 RES = (RES.AND."177777777777)*2 + J
      RETURN
      END
 1001 LTEXT(I)=0
      I=1
      CALL IFILE(1,'TEXT')
word = URLDecoder.decode(word, "utf-8");
word = new String(word.getBytes("8859_1"), "UTF-8");
if (language.equals(Language.GREEK)) {
    word = GreekEncodingAnalyzer.transcode(word, "PerseusBetaCode");
}
/* Completely rebuilt this part on may 10th, 2008, in order to make the class
 * more robust. Before it assumed that certain elements would be contained in
 * a particular order, without any intervening xml tags that might be added
 * by various applications. (Such as G7towin, which produces gpx files that
 * this class threw up on.) I tried to make it more robust by looking for tags
 * and accepting their data if they contained "wpt" data, and just ignoring
 * anything else it finds. Brett */
 * …[the software] should now handle any gpx file containing
 * waypoints, regardless of other junk in the file.
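The tolerant strategy that comment describes can be sketched in a few lines. This is an illustrative reconstruction, not the TBGpxParser code: the function name `extractWaypoints`, the regular expression, and the assumption that `lat` precedes `lon` in each tag are all hypothetical.

```javascript
// Illustrative sketch only - not the TBGpxParser implementation. It mirrors
// the strategy the comment describes: scan the whole file, keep anything that
// looks like a <wpt> element, and ignore every other tag ("junk").
// Assumes lat appears before lon inside each wpt tag.
function extractWaypoints(gpx) {
  var waypoints = [];
  var wpt = /<wpt\b[^>]*lat="([^"]+)"[^>]*lon="([^"]+)"/g;
  var m;
  while ((m = wpt.exec(gpx)) !== null) {
    waypoints.push({ lat: parseFloat(m[1]), lon: parseFloat(m[2]) });
  }
  return waypoints;
}
```

A file full of unrelated elements still yields only the waypoint data, which is the robustness the comment is after.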
public TBMIDlet () {
// private String URL = "http://internetjunkee.com/transborder/GPScourseFinal.gpx";
TBGpxParser gpxParser = new TBGpxParser(TBGpxParser.RES, "../../../../" + gpxFileName);
// make sure the data is not expired
if (isExpired) {
    display.setCurrent(expired);
    display.vibrate(1000);
    playAudioFile("expired.wav", true);
} else if (expireWarning) {
    display.setCurrent(expirationWarning, tbDowsingCompass);
    display.vibrate(1000);
    playAudioFile("expiration.wav", true);
} else if (startUpAlert) { // first time only
    startUpAlert = false;
    display.setCurrent(startUpDisplay, tbDowsingCompass);
    display.vibrate(1000);
    playAudioFile("startup.wav", true);
} else { // we are good to go
    display.setCurrent(tbDowsingCompass);
}
if (exp <= System.currentTimeMillis()) {
    expired = new Alert(translation.translate("Data expired"),
            translation.translate("The data is expired, TBTool is not safe to use."),
            errorImage,
            AlertType.ERROR);
    expired.setTimeout(Alert.FOREVER);
    expired.addCommand(exit);
    expired.addCommand(ignore);
    expired.setCommandListener(this);
    isExpired = true;
noNearbyWaypoints = new Alert(translation.translate("No Nearby Points"),
// spreadsheet variables - edit these to point to your spreadsheet
var spreadsheetID = "1iwIx8sqiBTqbtAI7TX4vDGDVee3dPz0wa3MazBv2fzQ";
var siteSheet = '1569296108';
var pagesSheet = '28080804';
var itemsSheet = '0';
// function to get queries from the URL and get the URL slug for the requested page
function getQueries() {
    var queryString = window.location.search; // get queries from current URL
    if (queryString) {
        var urlParams = new URLSearchParams(queryString); // get queries as search parameters
        openPage = urlParams.get('page').replace('%20', ' '); // set the openPage variable to the value from the current URL
    }
}
// Connect to Google Visualization API and query the Site sheet to retrieve basic site configuration
var query = new google.visualization.Query('https://docs.google.com/spreadsheets/d/' + spreadsheetID + '/gviz/tq?gid=' + siteSheet + '&headers=1');
query.send(function (response) {
    // error handling
    if (response.isError()) {
        console.log('Error in query: ' + response.getMessage() + ' ' + response.getDetailedMessage());
        return;
    }
    var siteDT = response.getDataTable();    // get datatable from response
    var siteJsonData = siteDT.toJSON();      // convert datatable to JSON
    siteJsonData = JSON.parse(siteJsonData); // parse JSON
    siteConfig(siteJsonData);                // configure site using data from spreadsheet
}); // end of site configuration
// run Rondo page as static page using local data
$.getJSON("json/site.json", function (siteJsonData) { // get site data
    siteConfig(siteJsonData); // configure site using site data
    $.getJSON("json/pages.json", function (pagesJsonData) { // get pages data
        pages(pagesJsonData); // create pages
        $.getJSON("json/items.json", function (itemsJsonData) { // get items data
            itemsDataTable(itemsJsonData); // build the items table
        }); // end of items
    }); // end of pages
}); // end of configuration
var test = 4
tst + 6
Uncaught ReferenceError: tst is not defined
    at <anonymous>:1:1
var shortPhrase = ['circle on', 'dash on', 'let them', 'listen now', 'loop on', 'oh time', 'plunge on', 'reel on', 'roll on', 'run on', 'spool on', 'steady', 'swerve me?', 'turn on', 'wheel on', 'whirl on', 'you — too — ', 'fast-fish', 'loose-fish'];

var dickinsonNoun = [
    ['air', 'art', 'care', 'door', 'dust', 'each', 'ear', 'earth', 'fair', 'faith', 'fear', 'friend', 'gold', 'grace', 'grass', 'grave', 'hand', 'hill', 'house', 'joy', 'keep', 'leg', 'might', 'mind', 'morn', 'name', 'need', 'noon', 'pain', 'place', 'play', 'rest', 'rose', 'show', 'sight', 'sky', 'snow', 'star', 'thought', 'tree', 'well', 'wind', 'world', 'year'],
    ['again', 'alone', 'better', 'beyond', 'delight', 'dying', 'easy', 'enough', 'ever', 'father', 'flower', 'further', 'himself', 'human', 'morning', 'myself', 'power', 'purple', 'single', 'spirit', 'today'],
    ['another', 'paradise'],
    ['eternity'],
    ['immortality']
];

var courseStart = ['fix upon the ', 'cut to fit the ', 'how to withstand the'];

var dickinsonSyllable = ['bard', 'bead', 'bee', 'bin', 'bliss', 'blot', 'blur', 'buzz', 'curl', 'dirt', 'disk', 'doll', 'drum', 'fern', 'film', 'folk', 'germ', 'hive', 'hood', 'husk', 'jay', 'pink', 'plot', 'spun', 'toll', 'web'];

var melvilleSyllable = ['ash', 'bag', 'buck', 'bull', 'bunk', 'cane', 'chap', 'chop', 'clam', 'cock', 'cone', 'dash', 'dock', 'edge', 'eel', 'fin', 'goat', 'hag', 'hawk', 'hook', 'hoop', 'horn', 'howl', 'iron', 'jack', 'jaw', 'kick', 'kin', 'lime', 'loon', 'lurk', 'milk', 'net', 'pike', 'rag', 'rail', 'ram', 'sack', 'salt', 'tool'];

var dickinsonLessLess = [
    ['art', 'base', 'blame', 'crumb', 'cure', 'date', 'death', 'drought', 'fail', 'flesh', 'floor', 'foot', 'frame', 'fruit', 'goal', 'grasp', 'guile', 'guilt', 'hue', 'key', 'league', 'list', 'need', 'note', 'pang', 'pause', 'phrase', 'pier', 'plash', 'price', 'shame', 'shape', 'sight', 'sound', 'star', 'stem', 'stint', 'stir', 'stop', 'swerve', 'tale', 'taste', 'thread', 'worth'],
    ['arrest', 'blanket', 'concern', 'costume', 'cypher', 'degree', 'desire', 'dower', 'efface', 'enchant', 'escape', 'fashion', 'flavor', 'honor', 'kinsman', 'marrow', 'perceive', 'perturb', 'plummet', 'postpone', 'recall', 'record', 'reduce', 'repeal', 'report', 'retrieve', 'tenant'],
    ['latitude', 'retriever']
];

var upVerb = ['bask', 'chime', 'dance', 'go', 'leave', 'move', 'rise', 'sing', 'speak', 'step', 'turn', 'walk'];

var butBeginning = ['but', 'for', 'then'];

var butEnding = ['earth', 'sea', 'sky', 'sun'];

var nailedEnding = ['coffin', 'deck', 'desk', 'groove', 'mast', 'spar', 'pole', 'plank', 'rail', 'room', 'sash'];
function exclaimLine(n) {
    var a, b = n % twoSyllable.length;
    n = Math.floor(n / twoSyllable.length);
    a = n % threeToFiveSyllable.length;
    return threeToFiveSyllable[a] + '! ' + twoSyllable[b] + '!';
}
// The function nailedLine() produces a line beginning "nailed to the ..."
// In Moby-Dick, Ahab nails a doubloon to the mast, offering it as a reward
// to the one who sees the white whale first. This line template is meant to
// semantically mirror an extended attempt to find axial support, both by the
// reader of our poem and within Melville's novel, where being "at sea"
// involves trying to locate a moral compass, trying to track down a quarry,
// trying to control the crew through bribery, and using the mast itself as
// a pointer to the stars in 19th-century navigation.
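The comment describes nailedLine() but the function itself is not quoted here. A minimal sketch under that description might look like the following; it is a hypothetical reconstruction that assumes the function selects an ending from the nailedEnding array by modular indexing, the way exclaimLine() indexes its arrays.

```javascript
// Hypothetical reconstruction of nailedLine() - the published function is
// not shown in this article. Assumes an ending is drawn from nailedEnding
// (defined above) by modular indexing, as exclaimLine() does.
var nailedEnding = ['coffin', 'deck', 'desk', 'groove', 'mast', 'spar', 'pole', 'plank', 'rail', 'room', 'sash'];

function nailedLine(n) {
  return 'nailed to the ' + nailedEnding[n % nailedEnding.length];
}
```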
// The array variable shortPhrase contains short phrases, almost all of
// which are taken from Melville's Moby-Dick:
var shortPhrase = ['circle on', 'dash on', 'let them', 'listen now', 'loop on', 'oh time', 'plunge on', 'reel on', 'roll on', 'run on', 'spool on', 'steady', 'swerve me?', 'turn on', 'wheel on', 'whirl on', 'you — too — ', 'fast-fish', 'loose-fish'];
// The array variable dickinsonNoun contains common nouns from Dickinson's
// poems. We judged these nouns as common using a frequency analysis of the
// words in the poems.
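A frequency analysis of that kind can be sketched briefly. The wordFrequencies helper below is illustrative only, not the authors' code: it simply tallies how often each word occurs in a text, the sort of count used to decide which nouns are "common."

```javascript
// Illustrative sketch, not the authors' code: count how often each word
// occurs in a corpus, the kind of tally used to judge which Dickinson
// nouns count as common.
function wordFrequencies(text) {
  var counts = {};
  (text.toLowerCase().match(/[a-z']+/g) || []).forEach(function (word) {
    counts[word] = (counts[word] || 0) + 1;
  });
  return counts;
}
```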
"fields": [
    { "label": "title", "value": "lé de tenture" },
    { "label": "Désignation", "value": "lé de tenture" },
    { "label": "N° d'inventaire", "value": "VMB 14527" },
    { "label": "Domaine", "value": "Textiles" },
    { "label": "Date de création", "value": "XVIIIe siècle" }
]
{
  "@context": "https://w3id.org/openbadges/v2",
  "id": "urn:uuid:2d817e2a-278a-40fe-9080-af73e483699b",
  "type": "Assertion",
  "recipients": [
    {
      "type": "hash",
      "identity": "QmUeCAUYSNDK1gDrEGcwzRhbbJxwXAsAu7VN4D2ppzD1gQ",
      "url": "https://gateway.digitallatin.org/ipfs/QmUeCAUYSNDK1gDrEGcwzRhbbJxwXAsAu7VN4D2ppzD1gQ"
    },
    {
      "type": "hash",
      "identity": "QmUeCAUYSNDK1gDrEGcwzRhbbJxwXAsAu7VN4D2ppzD1gQ",
      "url": "https://gateway.digitallatin.org/ipfs/QmUeCAUYSNDK1gDrEGcwzRhbbJxwXAsAu7VN4D2ppzD1gQ"
    }
  ],
  "issuedOn": "2018-07-30 15:47:19 +0000",
  "verification": {
    "type": "signedBadge",
    "publicKey": "QmZ5MFqMdZDunokFFQq5rKHxtURgjTg8e78cmzCypxcadG",
    "publicKey-url": "https://gateway.digitallatin.org/ipfs/QmZ5MFqMdZDunokFFQq5rKHxtURgjTg8e78cmzCypxcadG"
  },
  "badge": {
    "image": "https://dll-review-registry.scta.info/maa-badge-working.svg",
    "issuer": {
      "id": "https://www.medievalacademy.org/",
      "image": "https://pbs.twimg.com/profile_images/1534408703/maa_symbol_small_400x400.png",
      "type": "Profile",
      "name": "Medieval Academy of America",
      "email": "info@themedievalacademy.org",
      "url": "https://www.medievalacademy.org/"
    },
    "criteria": {
      "narrative": "Meets the standards of a critical edition; judged by the MAA as equal in quality to the kinds of editions that appear in the MAA printed editions series or scholarly articles that appear in the MAA journal 'Speculum'."
    }
  }
}
{
  "id": "f4b8dc2f-4d41-478d-877b-843b46e21283",
  "review-society": "MAA",
  "date": "2018-07-30 11:16:57 UTC",
  "badge-url": "https://dll-review-registry.digitallatin.org/maa-badge-working.svg",
  "badge-rubric": "http://dll-review-registry.digitallatin.org/rubric/maa#green",
  "review-summary": "Review Summary",
  "sha-256": [
    "45304964c8bf9fb63737fa54e701b765baea0d950ff396c8fc686dd9bfda0416",
    "bab59377f29af62f1091ed367328db4abe96193a01f3b2c0e8615ba861e63317"
  ],
  "ipfs-hash": [
    "QmdgZhAFTepfXmUsxEGaVampJubZ7XMk4pC42uzHgBEgvy",
    "QmUxAedP9cDvC9dfwJrezwvHJWVD4w5UaJTHzLn37KEhMa"
  ],
  "submitted-url": [
    "https://raw.githubusercontent.com/scta-texts/hiltalingencommentary/59eae077a22511531b5da383d402d031ca3b26b7/jhb-l1q10/clm26711_jhb-l1q10.xml",
    "https://raw.githubusercontent.com/scta-texts/hiltalingencommentary/59eae077a22511531b5da383d402d031ca3b26b7/jhb-l1q9/clm26711_jhb-l1q9.xml"
  ],
  "submitted-by": "jeffreycwitt@gmail.com",
  "cert-ipfs-hash": "Qmbbotxgr1DBXTWXtDEkT2ZNh18ZPm73bHuAymTvNJQVim",
  "clearsigned-hash": "QmWZDUgZe12ALftn2G5z3GW4KMoLQXM9c4Ysb7yn8cRJTh",
  "detach-sig-hash": "QmYLprZk5uGJsaLjiKib31hiRdyyAQDuYznuZZpqgQfm7i"
}
[
  {
    "id": "52cb5b0a-7ae0-441d-9da5-12534a9a082f",
    "review-society": "MAA",
    "date": "2018-07-30 11:15:11 UTC",
    "badge-url": "http://dll-review-registry.scta.info/maa-badge-working.svg",
    "badge-rubric": "http://dll-review-registry.scta.info/rubric/maa#green",
    "review-summary": "review summary",
    "sha-256": [
      "45304964c8bf9fb63737fa54e701b765baea0d950ff396c8fc686dd9bfda0416",
      "bab59377f29af62f1091ed367328db4abe96193a01f3b2c0e8615ba861e63317"
    ],
    "ipfs-hash": [
      "QmdgZhAFTepfXmUsxEGaVampJubZ7XMk4pC42uzHgBEgvy",
      "QmUxAedP9cDvC9dfwJrezwvHJWVD4w5UaJTHzLn37KEhMa"
    ],
    "submitted-url": [
      "https://raw.githubusercontent.com/scta-texts/hiltalingencommentary/59eae077a22511531b5da383d402d031ca3b26b7/jhb-l1q10/clm26711_jhb-l1q10.xml",
      "https://raw.githubusercontent.com/scta-texts/hiltalingencommentary/59eae077a22511531b5da383d402d031ca3b26b7/jhb-l1q9/clm26711_jhb-l1q9.xml"
    ],
    "submitted-by": "jeffreycwitt@gmail.com",
    "cert-ipfs-hash": "QmbhSB9CoRgCA3KBh5JXyavSaS2DymBZQQMRsqTiBSEGpC",
    "clearsigned-hash": "Qmc1165TM3QDsRVFboy56nHSrCF5w2RqkrMDqEoEvvnz3v",
    "detach-sig-hash": "Qmc1165TM3QDsRVFboy56nHSrCF5w2RqkrMDqEoEvvnz3v"
  },
  {
    "id": "f4b8dc2f-4d41-478d-877b-843b46e21283",
    "review-society": "MAA",
    "date": "2018-07-30 11:16:57 UTC",
    "badge-url": "http://dll-review-registry.scta.info/maa-badge-working.svg",
    "badge-rubric": "http://dll-review-registry.scta.info/rubric/maa#green",
    "review-summary": "review summary",
    "sha-256": [
      "45304964c8bf9fb63737fa54e701b765baea0d950ff396c8fc686dd9bfda0416",
      "bab59377f29af62f1091ed367328db4abe96193a01f3b2c0e8615ba861e63317"
    ],
    "ipfs-hash": [
      "QmdgZhAFTepfXmUsxEGaVampJubZ7XMk4pC42uzHgBEgvy",
      "QmUxAedP9cDvC9dfwJrezwvHJWVD4w5UaJTHzLn37KEhMa"
    ],
    "submitted-url": [
      "https://raw.githubusercontent.com/scta-texts/hiltalingencommentary/59eae077a22511531b5da383d402d031ca3b26b7/jhb-l1q10/clm26711_jhb-l1q10.xml",
      "https://raw.githubusercontent.com/scta-texts/hiltalingencommentary/59eae077a22511531b5da383d402d031ca3b26b7/jhb-l1q9/clm26711_jhb-l1q9.xml"
    ],
    "submitted-by": "jeffreycwitt@gmail.com",
    "cert-ipfs-hash": "Qmbbotxgr1DBXTWXtDEkT2ZNh18ZPm73bHuAymTvNJQVim",
    "clearsigned-hash": "QmWZDUgZe12ALftn2G5z3GW4KMoLQXM9c4Ysb7yn8cRJTh",
    "detach-sig-hash": "QmYLprZk5uGJsaLjiKib31hiRdyyAQDuYznuZZpqgQfm7i"
  }
]
{
  "verification-message": "gpg: Signature made Mon Jul 30 07:16:58 2018 EDT using RSA key ID 76160352 gpg: Good signature from \"Medieval Academy of America (Keys for Medieval Academy of America)\""
}
{
  "Afrique du Sud" : "dbr:South_Africa",
  "Andorre" : "dbr:Andorra",
  "Angola" : "dbr:Angola",
  "Arabie saoudite" : "dbr:Saudi_Arabia"
}
\version "2.22"
\header {
  title = "Motherless Child"
  composer = "African-American spiritual"
  tagline = ##f % don't add a Lilypond credit to footer
}
MusicS = {
  \clef "treble"
  \time 4/4
  \key des\major % that is, d-es, D flat
  | des''4 f''2 des''4 % c' = Helmholtz C4
  | f''2 des''4 es''4
  | f''4. es''4. des''4
  | bes'2. r4
  \bar "|."
}
LyricsS = \lyricmode {
  Some -- times I feel like a mo -- ther -- less child
}
\score {
  <<
    \new Staff \with { instrumentName = "Soprano" }
    <<
      \new Voice = "Sop" { \MusicS }
      \new Lyrics \lyricsto "Sop" { \LyricsS }
    >>
  >>
  \layout {
    indent = 1\in % first-line indent
    ragged-right = ##f % fill the whole line
  }
}
\lyricmode {
  Sanc -- tus,
  \override Lyrics.LyricText.font-shape = #'italic
  sanc -- tus
  \revert Lyrics.LyricText.font-shape
  sanc -- tus
}
\lyricmode {
  Sanc -- tus,
  \EdLyrics { sanc -- tus, }
  sanc -- tus
}
\IncipitStaff "TIPLE I-1" "Ti. I-1" { \IncipitSIi }
% Chorus I, Soprano 1 incipit
IncipitSIi = {
  \MSclefGii % treble clef
  \MeterCZ   % Spanish CZ meter symbol (= C3)
  a''2       % first note
}
(define mezzo
  (lambda (dynamic)
    (string-append "mezzo" dynamic)))
Section = #(define-scheme-function (SectionText) (markup?)
  "Print a section title"
  #{
    \once \override Score.RehearsalMark.self-alignment-X = #LEFT
    \once \override Score.RehearsalMark.padding = #6
    \once \override Score.RehearsalMark.outside-staff-priority = #2000
    \mark \markup \fontsize #1.5 $SectionText
  #})
MusicT = {
  \clef "treble"
  \MeterTriple
  | a'2 b'2 c''2
  | f'2\color e'1
  | a'1.~
  | a'2 gis'1\endcolor
  | b'2 c''2 d''2
  | cis''2. cis''4 d''2~\color
  | d''2 e''1\endcolor
}
ColorBracketLeft = \markup {
  \combine \draw-line #'(0 . -1) \draw-line #'(1.5 . 0)
}
ColorBracketRight = \markup {
  \combine \draw-line #'(0 . -1) \draw-line #'(-1.5 . 0)
}
DrawColorBrackets = {
  \override TextSpanner.dash-period = #0
  \override TextSpanner.bound-details.left.text = \ColorBracketLeft
  \override TextSpanner.bound-details.right.text = \ColorBracketRight
  \override TextSpanner.bound-details.left.attach-dir = #-1
  \override TextSpanner.bound-details.right.attach-dir = #1
  \override TextSpanner.bound-details.left-broken.text = ##f
  \override TextSpanner.bound-details.right-broken.text = ##f
  \override TextSpanner.staff-padding = #2
}
self.__template_word_categories = set()
self.__toxicity = set(["toxic", "nontoxic"])
for person in root.findall('.//tei:person', tei):
    person_id = person.get('{http://www.w3.org/XML/1998/namespace}id')
    person_uri = URIRef(base_uri + '/person/' + person_id)
    person_ref = '#' + person_id
    g.add((person_uri, RDF.type, crm.E21_Person))
    same_as = person.get('sameAs')
    if same_as is not None:
        same_as = same_as.split()
        i = 0
        while i < len(same_as):
            same_as_uri = URIRef(same_as[i])
            g.add((person_uri, OWL.sameAs, same_as_uri))
            i += 1
    persname = person.find('./tei:persName', tei)
    if persname is not None:
        label = persname.text
        label_lang = persname.get('{http://www.w3.org/XML/1998/namespace}lang')
        if label_lang is not None:
            g.add((person_uri, RDFS.label, Literal(label, lang=label_lang)))
        else:
            g.add((person_uri, RDFS.label, Literal(label)))
def us_slavery(year):
    year = int(year)
    if 1619 <= year <= 1863:
        self = 'enslaved'
    else:
        self = 'free'
    return self

def heir(I):
    I = 'his sorrowful, terrible heir'
    return I

def lucifer(state):
    I = 'self'
    if state == 'no':
        lucifer = 'Black_Lucifer'
        return lucifer
    else:
        I = heir(I)
        return I

def sleepers():
    I = 'hell-name'
    self = 'Curse'
    year = int(input('What year is it? '))
    self = us_slavery(year)
    if 1849 <= year < 1881:
        state = input('Was Black Lucifer dead? ')
        I = lucifer(state)
        if I != 'Black_Lucifer':
            print('I am', I)
            I = ['the God of Revolt', 'deathless', 'sorrowful', 'vast']
            for i in I:
                print('I am', i)
        else:
            if not self:
                print('REVOLT')
    elif year >= 1881:
        print('A show of the summer softness\na contact of something unseen\nan amour of the light and air')
    elif 1776 <= year < 1849:
        print('We hold these truths to be self-evident\nthat all men are created equal')
    else:
        print('Replica of the Great Hole of history')

sleepers()
TypeError: us_slavery() missing 2 required positional arguments: 'self' and 'year'

NameError: name 'slave' is not defined

NameError: name 'I' is not defined

Traceback (most recent call last):
  File "sleepers.py", line 21, in sleepers
    lucifer = Black_Lucifer
NameError: name 'Black_Lucifer' is not defined

Traceback (most recent call last):
  File "sleepers.py", line 40, in sleepers
    lucifer = lucifer(input("Was Black Lucifer dead? "))
UnboundLocalError: local variable 'lucifer' referenced before assignment
class TreebankUnit(HookTest.units.TESTUnit):
    tests = ["parsable", "has_root"]
    readable = {
        "parsable": "File parsing",
        "has_root": "Root declared"
    }

    def __init__(self, path):
        super(TreebankUnit, self).__init__(path)

    def has_root(self):
        # Process
        self.log("If something needs to be verbose")
        has_root = True
        # Assign result as a boolean
        yield has_root

    def test(self, scheme):
        tests = [] + CTSUnit.tests
        tests.append(scheme)
        for testname in tests:
            # Show the logs and return the status
            for status in getattr(self, testname)():
                yield (
                    TreebankUnit.readable[testname],
                    status,
                    self.logs
                )
                self.flush()
genderdata::kantrowitz
## Source: local data frame [7,579 x 2]
##
##          name gender
## 1       aamir   male
## 2       aaron   male
## 3       abbey either
## 4       abbie either
## 5       abbot   male
## 6      abbott   male
## 7        abby either
## 8       abdel   male
## 9       abdul   male
## 10 abdulkarim   male
## ..        ...    ...
gender("abby", method = "kantrowitz")
## $name
## [1] "abby"
##
## $gender
## [1] "either"
genderdata::ssa_national
## Source: local data frame [1,603,026 x 4]
##
##         name year female male
## 1      aaban 2007      0    5
## 2      aaban 2009      0    6
## 3      aaban 2010      0    9
## 4      aaban 2011      0   11
## 5      aaban 2012      0   11
## 6      aabha 2011      7    0
## 7      aabha 2012      5    0
## 8      aabid 2003      0    5
## 9  aabriella 2008      5    0
## 10     aadam 1987      0    5
## ..       ...  ...    ...  ...
genderdata::ssa_national %>%
  filter(name == "sidney", year == 1935)
## Source: local data frame [1 x 4]
##
##     name year female male
## 1 sidney 1935     93  974
genderdata::ssa_national %>%
  filter(name == "sidney", year == 1935) %>%
  mutate(proportion_female = female / (male + female),
         proportion_male = 1 - proportion_female)
## Source: local data frame [1 x 6]
##
##     name year female male proportion_female proportion_male
## 1 sidney 1935     93  974        0.08716026       0.9128397
\begin{verbatim}
library("NMF")
library("FactoMineR")
library("data.table")
library("ggplot2")
library("cowplot")
library("ggdendro")
library("pvclust")

#data.df<-data.frame(syl_matrix_forR_090416)[1:319,5:14]
data.df<-data.frame(syl_matrix_forR_090416)[1:319,4:14]
rownames(data.df)<-data.df$Sign.Value
data.df<-data.df[,2:11]
data.df<-data.df[rowSums(data.df)>0,]

#unfiltered w/ Assur
colnames(data.df)<-gsub("Esznunna","Eshnunna",colnames(data.df))
rclust<-hclust(dist(data.df,method="manhattan"), method="ward.D2")
cclust<-hclust(dist(t(data.df),method="manhattan"), method="ward.D2")
aheatmap(data.df, color='grey:2', Rowv=rclust, breaks=c(-0.05,0.5,1.05),
  labRow=NULL,
  main="Hierarchical clustering of sites by syllabic value attestations \n",
  legend=FALSE, fontsize=14, cexCol = 0.8)

#unfiltered data w/ Assur
forpca<-t(data.df)
answer<-PCA(forpca,ncp=10,graph=FALSE)
pc.eig.df<-data.frame(answer$eig)
ggplot(data=pc.eig.df[c(1:9),],
    aes(x=gsub("comp","Comp.", rownames(pc.eig.df[c(1:9),])),
        y=percentage.of.variance)) +
  geom_bar(stat="identity") +
  xlab("\nPrincipal Component") +
  ylab("Percentage of Variance") +
  ggtitle("Variance distribution across principal components derived from viable syllabic values")
coord.rs.df<-data.frame(answer$ind$coord)
gsub("Esznunna","Ešnunna",row.names(coord.rs.df))
Geography<-factor(c("Syria", "Syria", "Syria", "Syria",
    "Southern Mesopotamia", "Southern Mesopotamia",
    "Southern Mesopotamia", "Southern Mesopotamia",
    "Northern Mesopotamia", "Northern Mesopotamia"),
  levels=c("Southern Mesopotamia", "Northern Mesopotamia", "Syria"))
Period<-factor(c("Old Akkadian\n(ca. 2350-2200 BC)",
    "\nUr III/ Shakkanakku\n(ca. 2100-2000 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "\nEarly Old Babylonian\n(ca. 2000-1900 BC)\n",
    "\nOld Akkadian & Ur III\n(ca. 2350-2200, 2100-2000 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)"),
  levels=c("Old Akkadian\n(ca. 2350-2200 BC)",
    "\nOld Akkadian & Ur III\n(ca. 2350-2200, 2100-2000 BC)",
    "\nUr III / Shakkanakku\n(ca. 2100-2000 BC)",
    "\nEarly Old Babylonian\n(ca. 2000-1900 BC)\n"))
ggplot(data=coord.rs.df, aes(x=Dim.1, y=Dim.2)) +
  geom_point(aes(shape=Period, fill=Geography, color=Geography), size=4) +
  scale_shape_manual(values=c(21,22,23,24)) +
  xlab("Princ. Comp. 1") + ylab("Princ. Comp. 2") +
  geom_text(label=gsub("Esznunna","Ešnunna", row.names(coord.rs.df)), nudge_y=0.7)

#filtering out hapax signs and allsites signs
data2.df<-data.df[rowSums(data.df)>1 & rowSums(data.df)<9,-9]
colnames(data2.df)<-gsub("Esznunna","Eshnunna",colnames(data2.df))
colnames(data.df)<-gsub("Esznunna","Eshnunna",colnames(data.df))

#convert table into final table for thesis for hapax signs
data.onesite.df<-data.df[rowSums(data.df)==1,]
data.onesite.df$Site<-"DUMMY"
data.onesite.df[data.onesite.df$Ebla==1,]$Site<-"Ebla"
data.onesite.df[data.onesite.df$Mari==1,]$Site<-"Mari"
data.onesite.df[data.onesite.df$Nabada==1,]$Site<-"Nabada"
data.onesite.df[data.onesite.df$Tuttul==1,]$Site<-"Tuttul"
data.onesite.df[data.onesite.df$Adab==1,]$Site<-"Adab"
data.onesite.df[data.onesite.df$Ešnunna==1,]$Site<-"Ešnunna"
data.onesite.df[data.onesite.df$Kish==1,]$Site<-"Kish"
data.onesite.df[data.onesite.df$Tutub==1,]$Site<-"Tutub"
#data.onesite.df[data.onesite.df$Assur==1,]$Site<-"Assur"
data.onesite.df[data.onesite.df$Gasur==1,]$Site<-"Gasur"
final.onesite.dt<-data.table(data.onesite.df, keep.rownames=TRUE)[,.(rn, Site)]
final.onesite.dt<-final.onesite.dt[order(Site)]
write.table(final.onesite.dt, file="hapax_signs.xls", quote=FALSE, sep="\t",
  row.names=FALSE)

#table of signs that occur at all sites
data.allsites.df<-data.df[rowSums(data.df[,-9])==9,-9]
write.table(rownames(data.allsites.df), file="allsites_signs.xls", quote=FALSE,
  sep="\t", row.names=FALSE)

#table of filtered data
write.table(rownames(data2.df), file="data_filtered.xls", quote=FALSE, sep="\t",
  row.names=TRUE, col.names=TRUE)
write.table(data2.df, file="data_filtered.xls", sep="\t")

#hierarchical clustering
colnames(data2.df)<-gsub("Esznunna","Eshnunna",colnames(data2.df))
rclust<-hclust(dist(data2.df,method="manhattan"), method="ward.D2")
cclust<-hclust(dist(t(data2.df),method="manhattan"), method="ward.D2")
aheatmap(data2.df, color='grey:2', Rowv=rclust, breaks=c(-0.05,0.5,1.05),
  labRow=NULL, legend=FALSE, fontsize=14, cexCol = 0.8)
result<-pvclust(data2.df, method.dist="manhattan", method.hclust="ward.D2",
  nboot=10000)
plot(result)

#PCA
colnames(data2.df)<-gsub("Ešnunna","Eshnunna",colnames(data2.df))
forpca<-t(data2.df)
answer<-PCA(forpca,ncp=3,graph=FALSE)
pc.eig.df<-data.frame(answer$eig)
ggplot(data=pc.eig.df[c(1:8),],
    aes(x=gsub("comp","Comp.", rownames(pc.eig.df[c(1:8),])),
        y=percentage.of.variance)) +
  geom_bar(stat="identity") +
  xlab("\nPrincipal Component") +
  ylab("Percentage of Variance") +
  ggtitle("Variance distribution across principal components \n derived from informative syllabic values \n")
coord.rs.df<-data.frame(answer$ind$coord)
#meta.pca.df<-meta.df[rownames(coord.rs.df),]

#PC1
gsub("Esznunna","Eshnunna", row.names(coord.rs.df))
Geography<-factor(c("Syria", "Syria", "Syria", "Syria",
    "Southern Mesopotamia", "Southern Mesopotamia",
    "Southern Mesopotamia", "Southern Mesopotamia",
    "Northern Mesopotamia"),
  levels=c("Southern Mesopotamia", "Northern Mesopotamia", "Syria"))
Period<-factor(c("Old Akkadian\n(ca. 2350-2200 BC)",
    "\nUr III / Shakkanakku\n(ca. 2100-2000 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "\nEarly Old Babylonian\n(ca. 2000-1900 BC)\n",
    "\nOld Akkadian & Ur III\n(ca. 2350-2200, 2100-2000 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)",
    "Old Akkadian\n(ca. 2350-2200 BC)"),
  levels=c("Old Akkadian\n(ca. 2350-2200 BC)",
    "\nOld Akkadian & Ur III\n(ca. 2350-2200, 2100-2000 BC)",
    "\nUr III / Shakkanakku\n(ca. 2100-2000 BC)",
    "\nEarly Old Babylonian\n(ca. 2000-1900 BC)\n"))
ggplot(data=coord.rs.df, aes(x=Dim.1, y=Dim.2)) +
  geom_point(aes(shape=Period, fill=Geography, color=Geography), size=4) +
  scale_shape_manual(values=c(21,22,23,24)) +
  xlab("Princ. Comp. 1") + ylab("Princ. Comp. 2") +
  geom_text(label=gsub("Esznunna","Eshnunna", row.names(coord.rs.df)), nudge_y=0.5)
ggplot(data=coord.rs.df, aes(x=Dim.2, y=Dim.3)) +
  geom_point(aes(shape=Period, fill=Geography, color=Geography), size=4) +
  scale_shape_manual(values=c(21,22,23,24)) +
  xlab("Princ. Comp. 2") + ylab("Princ. Comp. 3") +
  geom_text(label=gsub("Esznunna","Eshnunna", row.names(coord.rs.df)), nudge_y=0.5)
ggplot(data=coord.rs.df, aes(x=Dim.1, y=Dim.3)) +
  geom_point(aes(shape=Period, fill=Geography, color=Geography), size=4) +
  scale_shape_manual(values=c(21,22,23,24)) +
  xlab("Princ. Comp. 1") + ylab("Princ. Comp. 3") +
  geom_text(label=gsub("Esznunna","Eshnunna", row.names(coord.rs.df)), nudge_y=0.5)

#Extract attestations that define principal components 1-3
signs<-data.table(answer$var$contrib, keep.rownames = TRUE)

#dim1 excel table and visualization
dim1<-signs[order(-abs(Dim.1))][,1:2,with=FALSE]
ggplot(data=dim1, aes(x=seq(from=1, to=188, by=1), y=Dim.1)) +
  geom_point(size=1, color="black") +
  xlab("\nSyllabic value index ordered by loadings on the first principal component") +
  ylab("Loadings on the first principal component\n") +
  ggtitle("The distribution of loadings for syllabic values suggests that\nloadings greater than 1.2 should be further examined.") +
  geom_hline(yintercept = 1.25)
dim1<-dim1[dim1$Dim.1>1.1,]
data2.dim1.df<-data2.df[dim1$rn,]
colnames(data2.dim1.df)<-gsub("Esznunna","Eshnunna",colnames(data2.dim1.df))
rclust<-hclust(dist(data2.dim1.df,method="manhattan"), method="ward.D2")
cclust<-hclust(dist(t(data2.dim1.df),method="manhattan"), method="ward.D2")
anngeo<-list(Geography=Geography)
aheatmap(data2.dim1.df, color='grey:2', Rowv=rclust, breaks=c(-0.05,0.5,1.05),
  annCol = anngeo, legend=FALSE,
  main="Hierarchical clustering of sites by syllabic value attestations\nimportant in the first principal component",
  fontsize=10, treeheight=10, cexCol = 1, cexRow=2)
write.table(dim1, file="dim1_signloadings.xls", quote=FALSE, sep="\t",
  row.names=FALSE)

#dim2 excel table and visualization
dim2<-signs[order(-abs(Dim.2))][,c(1,3),with=FALSE]
ggplot(data=dim2, aes(x=seq(from=1, to=188, by=1), y=Dim.2)) +
  geom_point(size=1, color="black") +
  xlab("\nSyllabic value index ordered by loadings on the second principal component") +
  ylab("Loadings on the second principal component\n") +
  ggtitle("The distribution of loadings for syllabic values suggests that\nloadings greater than 1.1 should be further examined.") +
  geom_hline(yintercept = 1.1)
dim2<-dim2[dim2$Dim.2>1.1,]
data2.dim2.df<-data2.df[dim2$rn,]
colnames(data2.dim2.df)<-gsub("Esznunna","Eshnunna",colnames(data2.dim2.df))
rclust<-hclust(dist(data2.dim2.df,method="manhattan"), method="ward.D2")
cclust<-hclust(dist(t(data2.dim2.df),method="manhattan"), method="ward.D2")
annperiod<-list(Period=gsub("^ ","",gsub("\n"," ", Period)))
aheatmap(data2.dim2.df, color='grey:2', Rowv=rclust, breaks=c(-0.05,0.5,1.05),
  annCol=annperiod, legend=FALSE,
  main="Hierarchical clustering of sites by syllabic value attestations\nimportant in the second principal component",
  fontsize=10, treeheight=10, cexCol = 1, cexRow=3)
write.table(dim2, file="dim2_signloadings.xls", quote=FALSE, sep="\t",
  row.names=FALSE)

#dim3 excel table and visualization
dim3<-signs[order(-abs(Dim.3))][,c(1,4),with=FALSE]
ggplot(data=dim3, aes(x=seq(from=1, to=188, by=1), y=Dim.3)) +
  geom_point(size=1, color="black") +
  xlab("\nSyllabic value index ordered by loadings on the third principal component") +
  ylab("Loadings on the third principal component\n") +
  ggtitle("The distribution of loadings for syllabic values suggests that\nloadings greater than 1.3 should be further examined.") +
  geom_hline(yintercept = 1.3)
dim3<-dim3[dim3$Dim.3>1.3,]
data2.dim3.df<-data2.df[dim3$rn,]
colnames(data2.dim3.df)<-gsub("Esznunna","Eshnunna",colnames(data2.dim3.df))
rclust<-hclust(dist(data2.dim3.df,method="manhattan"), method="ward.D2")
cclust<-hclust(dist(t(data2.dim3.df),method="manhattan"), method="ward.D2")
aheatmap(data2.dim3.df, color='grey:2', Rowv=rclust, breaks=c(-0.05,0.5,1.05),
  legend=FALSE,
  main="Hierarchical clustering of sites by syllabic value attestations\nimportant in the third principal component",
  fontsize=10, treeheight=10, cexCol = 1, cexRow=2)
write.table(dim3, file="dim3_signloadings.xls", quote=FALSE, sep="\t",
  row.names=FALSE)
\end{verbatim}
for a in authors:
    for p in listOfPatternsOrderedByDecreasingContribution:
        n = getNearestNovel(p)
        add p to listOfPatterns[a]
        if listOfPatterns[a] has length = 5:
            exit
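The pseudocode above can be sketched as runnable Python. The inputs and the `get_nearest_novel` helper below are hypothetical stand-ins, since the pseudocode does not define them; the `exit` is read as moving on to the next author once five patterns are collected.

```python
def select_patterns(authors, patterns_by_contribution, get_nearest_novel):
    """Greedily keep the first five patterns per author.

    `patterns_by_contribution` is assumed to be pre-sorted by
    decreasing contribution, as in the pseudocode.
    """
    selected = {a: [] for a in authors}
    for a in authors:
        for p in patterns_by_contribution:
            n = get_nearest_novel(p)  # nearest novel for this pattern (unused here, as in the pseudocode)
            selected[a].append(p)
            if len(selected[a]) == 5:
                break  # five patterns collected; next author
    return selected

# Toy usage with invented data:
authors = ["A", "B"]
patterns = ["p1", "p2", "p3", "p4", "p5", "p6", "p7"]
result = select_patterns(authors, patterns, lambda p: None)
```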
create view t311 AS
select city, cell_id, hr,
       sum(graf) as graf, sum(illdumping) as illdumping, sum(bldgmaint) as bldgmaint,
       sum(sanitation) as sanitation, sum(traffic) as traffic, sum(vac_bldg) as vac_bldg,
       sum(vac_vehicle) as vac_vehicle, sum(stlghts) as stlghts, sum(alleylghts) as alleylghts,
       sum(garbage_pickup) as garbage_pickup,
       sum(graf) + sum(illdumping) + sum(bldgmaint) + sum(sanitation) + sum(traffic)
         + sum(vac_bldg) + sum(vac_vehicle) + sum(stlghts) + sum(alleylghts)
         + sum(garbage_pickup) as t311Events
FROM (
  select case when cell_id is not null then '4' else '4' end as city
       , cell_id
       , date_trunc('hour', "Requested Date/Time") as hr
       , case when t311_all."Service Name" = 'Graffiti Removal' then 1 else 0 end as graf
       , case when t311_all."Service Name" = 'Illegal Dumping' then 1 else 0 end as illdumping
       , case when t311_all."Service Name" = 'Maintenance Residential or Commercial' then 1 else 0 end as bldgmaint
       --, null as streetsw
       --, null as electrical
       , case when t311_all."Service Name" = 'Sanitation / Dumpster Violation' then 1 else 0 end as sanitation
       --, null as recycling
       , case when t311_all."Service Name" = 'Street Trees' then 1 else 0 end as tree
       , case when t311_all."Service Name" = 'Traffic (Other)' then 1 else 0 end as traffic
       , case when t311_all."Service Name" = ' Vacant House or Commercial' then 1 else 0 end as vac_bldg
       , case when t311_all."Service Name" = 'Abandoned Vehicle' then 1 else 0 end as vac_vehicle
       , case when t311_all."Service Name" = 'Street Light Outage' then 1 else 0 end as stlghts
       , case when t311_all."Service Name" = 'Alley Light Outage ' then 1 else 0 end as alleylghts
       --, null as potholes
       , case when t311_all."Service Name" = 'Rubbish/Recyclable Material Collection' then 1 else 0 end as garbage_pickup
       --, null as rodent
       --, null as sidewalk
  from t311_all
) t311_sub
group by city, cell_id, hr;
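The same hour-by-category counting can be sketched in pandas as a cross-check. The sample records below are invented stand-ins for the `t311_all` table, and only two of the service categories are shown.

```python
import pandas as pd

# Hypothetical sample of 311 service requests (stand-in for t311_all).
t311_all = pd.DataFrame({
    "Requested Date/Time": pd.to_datetime([
        "2019-05-01 09:10", "2019-05-01 09:45", "2019-05-01 10:05",
    ]),
    "Service Name": ["Graffiti Removal", "Illegal Dumping", "Graffiti Removal"],
    "cell_id": [7, 7, 7],
})

# One-hot encode each service category, as the CASE WHEN columns do ...
dummies = pd.get_dummies(t311_all["Service Name"])
dummies["hr"] = t311_all["Requested Date/Time"].dt.floor("h")
dummies["cell_id"] = t311_all["cell_id"]

# ... then sum per (cell, hour), as the outer SELECT with its GROUP BY does.
counts = dummies.groupby(["cell_id", "hr"]).sum()
counts["t311Events"] = counts.sum(axis=1)
```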
<tbx xml:lang="en" style="dct" type="TBX-Min">
  <text>
    <body>
      <conceptEntry id="1">
        <langSec xml:lang="en">
          <termSec>
            <term>mouse pad</term>
          </termSec>
          <termSec>
            <term>mouse mat</term>
          </termSec>
        </langSec>
        <langSec xml:lang="es">
          <termSec>
            <term>soporte para ratón</term>
          </termSec>
          <termSec>
            <term>alfombrilla de ratón</term>
          </termSec>
        </langSec>
        <langSec xml:lang="it">
          <termSec>
            <term>tappetino per mouse</term>
          </termSec>
        </langSec>
      </conceptEntry>
      <conceptEntry id="2">
        ...
      </conceptEntry>
    </body>
  </text>
</tbx>
<conceptEntry id="cl">
  <min:subjectField>e-mobility</min:subjectField>
  <langSec xml:lang="en">
    <descripGrp>
      <basic:definition>A battery-electric car that is capable of traveling at a maximum speed of 25 miles per hour (mph) and has a maximum loaded weight of 3,000 lbs.</basic:definition>
      <basic:source>TechTarget</basic:source>
      <min:externalCrossReference>https://www.techtarget.com/whatis/definition/neighborhood-electric-vehicle-NEV</min:externalCrossReference>
    </descripGrp>
    <termSec>
      <term>neighborhood electric vehicle</term>
      <basic:termType>fullForm</basic:termType>
      <min:partOfSpeech>noun</min:partOfSpeech>
      <min:administrativeStatus>preferredTerm-admn-sts</min:administrativeStatus>
      <descripGrp>
        <basic:context>A Neighborhood Electric Vehicle (NEV) is a U.S. category for battery electric vehicles that are usually built to have a top speed of 25 miles per hour (40 km/h), and have a maximum loaded weight of 3,000 lb (1,400 kg).</basic:context>
        <basic:source>Wikipedia</basic:source>
        <min:externalCrossReference>https://en.wikipedia.org/wiki/Neighborhood_Electric_Vehicle</min:externalCrossReference>
      </descripGrp>
    </termSec>
    <termSec>
      <term>NEV</term>
      <basic:termType>acronym</basic:termType>
      <min:partOfSpeech>noun</min:partOfSpeech>
      <min:administrativeStatus>admittedTerm-admn-sts</min:administrativeStatus>
    </termSec>
  </langSec>
  <langSec xml:lang="fr">
    ...
  </langSec>
  <langSec xml:lang="it">
    ...
  </langSec>
</conceptEntry>
<l>a)/ndra moi e)/nnepe, mou=sa, polu/tropon, o(\s ma/la polla\</l>
<l>mh=ter e)pei/ m' e)/teke/s ge minunqa/dio/n per e)o/nta,</l>
<l>timh/n pe/r moi o)/fellen *)olu/mpios e)gguali/cai</l>
<l>*zeu\s u(yibreme/ths: nu=n d' ou)de/ me tutqo\n e)/tisen:</l>
<l n="355">h)= ga/r m' *)atrei/+dhs eu)ru\ krei/wn *)agame/mnwn</l>
<l>h)ti/mhsen: e(lw\n ga\r e)/xei ge/ras au)to\s a)pou/ras.</l>
<lb/><milestone ed="P" unit="para"/>kou/rh d' *)wkeanou=, *xrusa/ori karteroqu/mw|
<lb rend="displayNum" n="980"/>mixqei=s' e)n filo/thti poluxru/sou *)afrodi/ths,
<lb/>*kalliro/h te/ke pai=da brotw=n ka/rtiston a(pa/ntwn,
<lb/>*ghruone/a, to\n ktei=ne bi/h *(hraklhei/h
<lb/>bow=n e(/nek' ei)lipo/dwn a)mfirru/tw| ei)n *)eruqei/h|.
---
title: Bol
author: Faiz Ahmed Faiz
---
Bol kih lab…
(Poem text is here in the body)
---
en:
  title: Faiz Ahmed Faiz
  bday: 1911.02.13
  body: Urdu poet
hi:
  title: "फ़ैज़ अहमद फ़ैज़"
  body: उर्दू शायर
ur:
  title: فیض احمد فیض
  body: اردو شاعر
---
---
en:
  title: Bol
  author:
    - Faiz Ahmed Faiz
  body: |-
    bol kih lab aazaad hai tere
    bol zabaa;n ab tak terii hai
hi:
  author:
    - Faiz Ahmed Faiz
  title: बोल
  body: |-
    बोल कि लब आज़ाद है तेरे
    बोल ज़बाँ अब तक तेरी है
ur:
  author:
    - Faiz Ahmed Faiz
  title: بول
  body: |-
    بول کہ لب آزاد ہے تیرے
    بول زباں اب تک تیری ہے
---
<http://dbpedia.org/resource/Alexander_the_Great>      # Alexander the Great
    <http://dbpedia.org/ontology/parent>               # has parent
    <http://dbpedia.org/resource/Philip_II_of_Macedon> . # Philip II of Macedon
<person sameAs="http://viaf.org/viaf/101353608">Alexander the Great</person>
<text lang="eng">
  <text lang="eng">
  </text>
</text>



