Knowledge Graph Creation: Part II


https://medium.com/analytics-vidhya/knowledge-graph-creation-part-ii-675fa480773a

An attempt

!pip install textpipeliner
!python -m spacy download de_core_news_sm

import spacy
import de_core_news_sm

nlp = de_core_news_sm.load()

from textpipeliner import PipelineEngine, Context
from textpipeliner.pipes import *

doc = nlp(u"""Die meisten Menschen zeigen sich kooperativ.
  Die allermeisten Leute zeigten sich kooperativ.""")

pipes_structure = [FindTokensPipe("[AUX,VERB]/*/[NOUN]"),
                   FindTokensPipe("[AUX,VERB]"),
                   FindTokensPipe("[AUX,VERB]/*/ADJ")]

engine = PipelineEngine(pipes_structure, Context(doc), [0, 1, 2])
process = engine.process()
process
[([Menschen], [zeigen], [kooperativ]), ([Leute], [zeigten], [kooperativ])]
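A minimal sketch of the idea behind path patterns such as "[AUX,VERB]/*/[NOUN]": start at a token whose part of speech is AUX or VERB, then follow child edges in the dependency tree (the "/*/" stands for any number of intermediate nodes) until a token with the target part of speech is found. The `Tok` class and `find_descendants` helper below are stand-ins of my own, not the real textpipeliner internals, so the sketch runs without a loaded language model:

```python
class Tok:
    """Stand-in for a spaCy Token: text, POS tag, and dependency children."""
    def __init__(self, text, pos, children=()):
        self.text, self.pos_, self.children = text, pos, list(children)

def find_descendants(token, pos):
    """Collect all descendants of `token` with the given POS tag."""
    found = []
    for child in token.children:
        if child.pos_ == pos:
            found.append(child)
        found.extend(find_descendants(child, pos))
    return found

# "Die meisten Menschen zeigen sich kooperativ." as a tiny dependency tree:
# the verb "zeigen" governs the noun "Menschen" and the adjective "kooperativ".
menschen = Tok("Menschen", "NOUN")
kooperativ = Tok("kooperativ", "ADJ")
zeigen = Tok("zeigen", "VERB", [menschen, Tok("sich", "PRON"), kooperativ])

triple = ([t.text for t in find_descendants(zeigen, "NOUN")],
          [zeigen.text],
          [t.text for t in find_descendants(zeigen, "ADJ")])
print(triple)  # (['Menschen'], ['zeigen'], ['kooperativ'])
```

This mirrors the first tuple of the pipeline output above: one pipe per slot, each pipe walking the tree from the verb downwards.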
 
 
...
 
doc = nlp(u""" ... """)
 
pipes_structure = [FindTokensPipe("[AUX,VERB]/*/NOUN"),
                   FindTokensPipe("[AUX,VERB]"),
                   FindTokensPipe("[AUX,VERB]/*/ADJ")]

engine = PipelineEngine(pipes_structure, Context(doc), [0, 1, 2])
process_ = engine.process()

import pandas as pd

# Keep only unambiguous triples: exactly one token per slot.
process = []
for proc in process_:
    if len(proc[0]) > 1 or len(proc[1]) > 1 or len(proc[2]) > 1:
        print('Skipped!')
    else:
        process.append([str(proc[0][0]), str(proc[1][0]), str(proc[2][0])])

process_df = pd.DataFrame(process)
process_df.columns = ['source', 'edge', 'target']
import networkx as nx
import matplotlib.pyplot as plt
%matplotlib inline

G = nx.from_pandas_edgelist(process_df, "source", "target",
                            edge_attr=True, create_using=nx.MultiDiGraph())

plt.figure(figsize=(12, 12))
pos = nx.spring_layout(G)

nx.draw(G, with_labels=True, node_color='skyblue',
        edge_cmap=plt.cm.Blues, pos=pos)
nx.draw_networkx_edge_labels(G, pos=pos)

plt.show()
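Besides drawing it, the graph can also be queried directly. This sketch rebuilds it from the two triples extracted in the first example, so it runs on its own without spaCy or textpipeliner:

```python
import pandas as pd
import networkx as nx

# The two (source, edge, target) triples from the first pipeline run.
triples = [["Menschen", "zeigen", "kooperativ"],
           ["Leute", "zeigten", "kooperativ"]]
df = pd.DataFrame(triples, columns=["source", "edge", "target"])

G = nx.from_pandas_edgelist(df, "source", "target",
                            edge_attr=True, create_using=nx.MultiDiGraph())

# Every edge keeps its verb as an attribute ...
for s, t, data in G.edges(data=True):
    print(s, "--[", data["edge"], "]-->", t)

# ... and shared targets show up as nodes with several incoming edges.
print(G.in_degree("kooperativ"))  # 2
```

Because `create_using=nx.MultiDiGraph()` allows parallel edges, two sentences relating the same pair of nodes via different verbs would both survive in the graph.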
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
I'll explain it soon, once I understand it better myself ;-).
Until then, a few links for self-study:
- python - Renaming columns in pandas - Stack Overflow
- How To Change Column Names and Row Indexes in Pandas? - Python and R Tips
- Create Your Own Expression Parser - Level Up Coding
- textpipeliner/textpipeliner/tests at master · krzysiekfonal/textpipeliner · GitHub
- textpipeliner/textpipeliner at master · krzysiekfonal/textpipeliner · GitHub
- GitHub - krzysiekfonal/textpipeliner: Python library for advanced text mining
- textpipeliner
- GitHub - networkx/networkx: Official NetworkX source code repository.
- NetworkX — NetworkX
- Welcome to treelib's documentation! — treelib 1.5.5 documentation
- tree-format · PyPI
- print-tree2 · PyPI
- SpaCy Manual HTML Rendering Looks Wrong - Prodigy Support
- Linguistic Features · spaCy Usage Documentation
- Visualizers · spaCy Usage Documentation
- Grammaregex library — regex-like for text mining - Krzysztof Fonał - Medium
- spaCy · Industrial-strength Natural Language Processing in Python
- GitHub - krzysiekfonal/grammaregex: Regex like pattern tree matching but on sentence's tree instead of Strings
- GitHub - PrecedenceBV/grammaregex: Regex like pattern tree matching but on sentence's tree instead of Strings
- grammaregex · PyPI
- Krzysztof Fonal | AngelList
- PyDigger - unearthing stuff about Python
- regex - NLP - Extracting "Correct" Noun Phrases - Stack Overflow
- EntityRecognizer · spaCy API Documentation
- DependencyParser · spaCy API Documentation
- Token · spaCy API Documentation
- German · spaCy Models Documentation
- A short introduction to NLP in Python with spaCy - Towards Data Science
- spaCy now speaks German · Blog · Explosion
- How to extract topics in a sentence and their respective dependent sentences? - codesd.com
- Main Clauses (Hauptsätze)
- Annotation Specifications · spaCy API Documentation
- spaCy 101: Everything you need to know · spaCy Usage Documentation
- Knowledge Graph Creation: Part II - Analytics Vidhya - Medium
- Knowledge Graph Creation: Part I - Analytics Vidhya - Medium
- Auto-Generated Knowledge Graphs - Towards Data Science
 
