\n\n
\n\n\nWe have only one type of node (e.g. Concept) to represent animals and two types of links, (e.g. Inheritance and Similarity) to represent relations between them.\n\n\n### Adding atoms\n\nWe can add nodes explicitly by calling add_node() passing a Python dict representing the node. This dict may contain any number of keys associated to values of any type (including lists, sets, nested dicts, etc) , which are all recorded with the node, but must contain at least the keys type and name mapping to strings which define the node uniquely, i.e. two nodes with the same type and name are considered to be the same entity.\n\nadd_link() works mostly in the same way. For links, the mandatory fields are type and targets, which is a list of Python dicts representing either nodes or nested links.\n\n\n```\npython\n das.add_node({\"type\": \"Concept\", \"name\": \"human\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"monkey\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"chimp\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"mammal\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"reptile\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"snake\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"dinosaur\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"triceratops\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"earthworm\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"rhino\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"vine\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"ent\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"animal\"})\n das.add_node({\"type\": \"Concept\", \"name\": \"plant\"}) ;\n```\n\n\nWe can also add nodes implicitly while adding links.\n\n\n```\npython\ndas.add_link(\n {\n \"type\": \"Similarity\",\n \"targets\": [\n {\"type\": \"Concept\", \"name\": \"human\"},\n {\"type\": \"Concept\", \"name\": \"monkey\"},\n ],\n }\n) ;\n```\n\n\n\"human\" and \"monkey\" would be inserted if they hadn't been inserted before. Adding the node or link more than once is allowed and has no side effects. 
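Because targets is just a list of dicts, a target can itself be a link dict, producing a nested link. The snippet below is a contrived sketch only to show the shape of such a call; the "Context" link type is hypothetical and is not part of the animal knowledge base used in this tutorial.

```python
# Hypothetical nesting example (not part of the animal knowledge base):
# the second target of this link is itself an Inheritance link dict.
das.add_link(
    {
        "type": "Context",  # invented type, used only to illustrate nesting
        "targets": [
            {"type": "Concept", "name": "mammal"},
            {
                "type": "Inheritance",
                "targets": [
                    {"type": "Concept", "name": "human"},
                    {"type": "Concept", "name": "mammal"},
                ],
            },
        ],
    }
)
```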
So let's add the whole set of links from our knowledge base.\n\n\n```\npython\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"human\"}, {\"type\": \"Concept\", \"name\": \"monkey\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"human\"}, {\"type\": \"Concept\", \"name\": \"chimp\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"chimp\"}, {\"type\": \"Concept\", \"name\": \"monkey\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"snake\"}, {\"type\": \"Concept\", \"name\": \"earthworm\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"rhino\"}, {\"type\": \"Concept\", \"name\": \"triceratops\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"snake\"}, {\"type\": \"Concept\", \"name\": \"vine\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"human\"}, {\"type\": \"Concept\", \"name\": \"ent\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"monkey\"}, {\"type\": \"Concept\", \"name\": \"human\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"chimp\"}, {\"type\": \"Concept\", \"name\": \"human\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"monkey\"}, {\"type\": \"Concept\", \"name\": \"chimp\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"earthworm\"}, {\"type\": \"Concept\", \"name\": \"snake\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"triceratops\"}, {\"type\": \"Concept\", \"name\": \"rhino\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"vine\"}, {\"type\": \"Concept\", \"name\": \"snake\"}, ], })\ndas.add_link( { \"type\": \"Similarity\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"ent\"}, {\"type\": \"Concept\", \"name\": \"human\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"human\"}, {\"type\": \"Concept\", \"name\": \"mammal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"monkey\"}, {\"type\": \"Concept\", \"name\": \"mammal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"chimp\"}, {\"type\": \"Concept\", \"name\": \"mammal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"mammal\"}, {\"type\": \"Concept\", \"name\": \"animal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"reptile\"}, {\"type\": \"Concept\", \"name\": \"animal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"snake\"}, {\"type\": \"Concept\", \"name\": \"reptile\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"dinosaur\"}, {\"type\": \"Concept\", \"name\": \"reptile\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"triceratops\"}, {\"type\": \"Concept\", \"name\": \"dinosaur\"}, 
], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"earthworm\"}, {\"type\": \"Concept\", \"name\": \"animal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"rhino\"}, {\"type\": \"Concept\", \"name\": \"mammal\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"vine\"}, {\"type\": \"Concept\", \"name\": \"plant\"}, ], })\ndas.add_link( { \"type\": \"Inheritance\", \"targets\": [ {\"type\": \"Concept\", \"name\": \"ent\"}, {\"type\": \"Concept\", \"name\": \"plant\"}, ], }) ;\n```\n\n\nLinks are always asymetric, so symmetric relationships like \"Similarity\" are represented by adding two links. For instance:\n \n```\ndas.add_link(\n {\n \"type\": \"Similarity\",\n \"targets\": [\n {\"type\": \"Concept\", \"name\": \"human\"},\n {\"type\": \"Concept\", \"name\": \"monkey\"},\n ],\n }\n)\n```\n\n\nand\n\n```\ndas.add_link(\n {\n \"type\": \"Similarity\",\n \"targets\": [\n {\"type\": \"Concept\", \"name\": \"monkey\"},\n {\"type\": \"Concept\", \"name\": \"human\"},\n ],\n }\n)\n```\n\n\nConsidering this, we can print the atom count again.\n\n\n```\npython\nprint(das.count_atoms())\n```\n\n\n (14, 26)\n\n\n\n### Fetching from a DAS server\n\nInstead of adding atoms by calling add_node() and add_link() directly, it's possible to fetch all or part of the contents from a DAS server using the method fetch(). This method doesn't create a lasting connection with the DAS server, it will just fetch the atoms once and close the connection so any subsequent changes or queries will not be propagated to the server in any way. After fetching the atoms, all queries will be made locally. It's possible to call fetch() multiple times fetching from the same DAS Server or from different ones.\n\n\n```\npython\nfrom hyperon_das import DistributedAtomSpace\n\nremote_das_host = \"45.63.85.59\"\nremote_das_port = 8080\n\nimported_das = DistributedAtomSpace()\nprint(imported_das.count_atoms())\n\nlinks_to_import = {\n 'atom_type': 'link',\n 'type': 'Expression',\n 'targets': [\n {'atom_type': 'node', 'type': 'Symbol', 'name': 'Inheritance'},\n {'atom_type': 'variable', 'name': 'v2'},\n {'atom_type': 'variable', 'name': 'v3'},\n ]\n}\n\nimported_das.fetch(links_to_import, remote_das_host, remote_das_port)\nprint(imported_das.count_atoms())\n```\n\n\n (0, 0)\n (15, 12)\n\n\nThe first parameter of fetch() is a pattern to describe which atoms should be fetched. It's exactly the same pattern used to make [pattern matching](#patternmatcher).\n\n\n### Getting atoms by their properties\n\nDAS has an API to query atoms by their properties. Most of this API is based on atom handles. Handles are MD5 signatures associated with atoms. For now they are supposed to be unique ids for atoms although this is not 100% true (conflict handling is planned to be implemented in the near future). 
DAS provides two static methods to compute handles for nodes and links: das.get_node_handle() and das.get_link_handle().\n\n\n```\npython\nhuman = das.get_node_handle('Concept', 'human')\nent = das.get_node_handle('Concept', 'ent')\n\nprint(\"human:\", human)\nprint(\"ent:\", ent)\n\nsimilarity_link = das.get_link_handle('Similarity', [human, ent])\n\nprint(\"Similarity link:\", similarity_link)\n```\n\n\n human: af12f10f9ae2002a1607ba0b47ba8407\n ent: 4e8e26e3276af8a5c2ac2cc2dc95c6d2\n Similarity link: 16f7e407087bfa0b35b13d13a1aadcae\n\n\nNote that these are static methods which don't actually query the stored atomspace in order to compute those handles. Instead, they just run a MD5 hashing algorithm over the data that uniquely identifies nodes and links, i.e. node type and name in the case of nodes and link type and targets in the case of links. This means e.g. that two nodes with the same type and the same name are considered to be the exact same entity.\n\nAtom handles can be used to retrieve the actual atom document.\n\n\n```\npython\ndas.get_atom(human)\n```\n\n\n\n\n\n {'handle': 'af12f10f9ae2002a1607ba0b47ba8407',\n 'type': 'Concept',\n 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3',\n 'name': 'human',\n 'named_type': 'Concept'}\n\n\n\nConvenience methods can be used to retrieve atoms passing its basic properties instead.\n\n\n```\npython\nprint(\"human:\", das.get_node('Concept', 'human'))\nprint(\"\\nSimilarity link:\", das.get_link('Similarity', [human, ent]))\n```\n\n\n human: {'handle': 'af12f10f9ae2002a1607ba0b47ba8407', 'type': 'Concept', 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3', 'name': 'human', 'named_type': 'Concept'}\n \n Similarity link: {'handle': '16f7e407087bfa0b35b13d13a1aadcae', 'type': 'Similarity', 'composite_type_hash': 'ed73ea081d170e1d89fc950820ce1cee', 'is_toplevel': True, 'composite_type': ['a9dea78180588431ec64d6bc4872fdbc', 'd99a604c79ce3c2e76a2f43488d5d4c3', 'd99a604c79ce3c2e76a2f43488d5d4c3'], 'named_type': 'Similarity', 'named_type_hash': 'a9dea78180588431ec64d6bc4872fdbc', 'targets': ['af12f10f9ae2002a1607ba0b47ba8407', '4e8e26e3276af8a5c2ac2cc2dc95c6d2']}\n\n\nIt's possible to get all links pointing to a specific atom.\n\n\n```\npython\n# All links pointing from/to 'rhino'\n\nrhino = das.get_node_handle('Concept', 'rhino')\nlinks = das.get_incoming_links(rhino)\nfor link in links:\n print(link['type'], link['targets'])\n```\n\n\n Similarity ['d03e59654221c1e8fcda404fd5c8d6cb', '99d18c702e813b07260baf577c60c455']\n Similarity ['99d18c702e813b07260baf577c60c455', 'd03e59654221c1e8fcda404fd5c8d6cb']\n Inheritance ['99d18c702e813b07260baf577c60c455', 'bdfe4e7a431f73386f37c6448afe5840']\n\n\nLinks can also be retrieved by other properties or partial definition of its main properties (type and targets). 
The method get_links() can be used passing different combinations of parameters.\n\n\n```\npython\n# All inheritance links\n\nlinks = das.get_links(link_type='Inheritance')\nfor link in links:\n print(link['type'], link['targets'])\n```\n\n\n Inheritance ['5b34c54bee150c04f9fa584b899dc030', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['b94941d8cd1c0ee4ad3dd3dcab52b964', '80aff30094874e75028033a38ce677bb']\n Inheritance ['bb34ce95f161a6b37ff54b3d4c817857', '0a32b476852eeb954979b87f5f6cb7af']\n Inheritance ['c1db9b517073e51eb7ef6fed608ec204', 'b99ae727c787f1b13b452fd4c9ce1b9a']\n Inheritance ['bdfe4e7a431f73386f37c6448afe5840', '0a32b476852eeb954979b87f5f6cb7af']\n Inheritance ['1cdffc6b0b89ff41d68bec237481d1e1', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['af12f10f9ae2002a1607ba0b47ba8407', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['b99ae727c787f1b13b452fd4c9ce1b9a', '0a32b476852eeb954979b87f5f6cb7af']\n Inheritance ['4e8e26e3276af8a5c2ac2cc2dc95c6d2', '80aff30094874e75028033a38ce677bb']\n Inheritance ['d03e59654221c1e8fcda404fd5c8d6cb', '08126b066d32ee37743e255a2558cccd']\n Inheritance ['99d18c702e813b07260baf577c60c455', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['08126b066d32ee37743e255a2558cccd', 'b99ae727c787f1b13b452fd4c9ce1b9a']\n\n\n\n```\npython\n# Inheritance links between two Concept nodes\n\nlinks = das.get_links(link_type='Inheritance', target_types=['Concept', 'Concept'])\nfor link in links:\n print(link['type'], link['targets'])\n```\n\n\n Inheritance ['5b34c54bee150c04f9fa584b899dc030', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['b94941d8cd1c0ee4ad3dd3dcab52b964', '80aff30094874e75028033a38ce677bb']\n Inheritance ['bb34ce95f161a6b37ff54b3d4c817857', '0a32b476852eeb954979b87f5f6cb7af']\n Inheritance ['c1db9b517073e51eb7ef6fed608ec204', 'b99ae727c787f1b13b452fd4c9ce1b9a']\n Inheritance ['bdfe4e7a431f73386f37c6448afe5840', '0a32b476852eeb954979b87f5f6cb7af']\n Inheritance ['1cdffc6b0b89ff41d68bec237481d1e1', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['af12f10f9ae2002a1607ba0b47ba8407', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['b99ae727c787f1b13b452fd4c9ce1b9a', '0a32b476852eeb954979b87f5f6cb7af']\n Inheritance ['4e8e26e3276af8a5c2ac2cc2dc95c6d2', '80aff30094874e75028033a38ce677bb']\n Inheritance ['d03e59654221c1e8fcda404fd5c8d6cb', '08126b066d32ee37743e255a2558cccd']\n Inheritance ['99d18c702e813b07260baf577c60c455', 'bdfe4e7a431f73386f37c6448afe5840']\n Inheritance ['08126b066d32ee37743e255a2558cccd', 'b99ae727c787f1b13b452fd4c9ce1b9a']\n\n\n\n```\npython\n# Similarity links where 'snake' is the first target\n\nsnake = das.get_node_handle('Concept', 'snake')\nlinks = das.get_links(link_type='Similarity', link_targets=[snake, '*'])\nfor link in links:\n print(link['type'], link['targets'])\n```\n\n\n Similarity ['c1db9b517073e51eb7ef6fed608ec204', 'b94941d8cd1c0ee4ad3dd3dcab52b964']\n Similarity ['c1db9b517073e51eb7ef6fed608ec204', 'bb34ce95f161a6b37ff54b3d4c817857']\n\n\n\n```\npython\n# Any links where 'snake' is the first target\n\nsnake = das.get_node_handle('Concept', 'snake')\nlinks = das.get_links(link_type='*', link_targets=[snake, '*'])\nfor link in links:\n print(link['type'], link['targets'])\n```\n\n\n Similarity ['c1db9b517073e51eb7ef6fed608ec204', 'b94941d8cd1c0ee4ad3dd3dcab52b964']\n Inheritance ['c1db9b517073e51eb7ef6fed608ec204', 'b99ae727c787f1b13b452fd4c9ce1b9a']\n Similarity ['c1db9b517073e51eb7ef6fed608ec204', 'bb34ce95f161a6b37ff54b3d4c817857']\n\n\n\n### Traversing the hypergraph\n\nIt's possible to 
traverse the hypergraph using a TraverseEngine which is like a cursor that can be moved through nodes and links. First, let's initiate a TraverseEngine pointing to \"human\". In order to do this, we need to call get_traversal_cursor() passing the handle of the atom to be used as the starting point for the traversing. This atom can be either a link or a node. We'll use the method das.get_node_handle() to get the handle of the Concept \"human\" and start on it.\n\n\n```\npython\ncursor = das.get_traversal_cursor(das.get_node_handle('Concept', 'human'))\n```\n\n\nOnce we have a cursor we can get the whole document of the atom pointed by it:\n\n\n```\npython\ncursor.get()\n```\n\n\n\n\n\n {'handle': 'af12f10f9ae2002a1607ba0b47ba8407',\n 'type': 'Concept',\n 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3',\n 'name': 'human',\n 'named_type': 'Concept'}\n\n\n\nWe can also see all links that make reference to cursor. Optional parameters can be used to filter which links should be considered. Here are some examples. We're printing only link type and targets to make the output cleaner.\n\n\n```\npython\n# All links pointing from/to cursor\nprint(\"All links:\", [(d['type'], d['targets']) for d in cursor.get_links()])\n\n# Only Inheritance links\nprint(\"\\nInheritance links:\", [(d['type'], d['targets']) for d in cursor.get_links(link_type='Inheritance')])\n\n# Links whose first target is our cursor\nprint(\"\\n'human' is first link target:\", [(d['type'], d['targets']) for d in cursor.get_links(cursor_position=0)])\n```\n\n\n All links: [('Similarity', ['af12f10f9ae2002a1607ba0b47ba8407', '4e8e26e3276af8a5c2ac2cc2dc95c6d2']), ('Inheritance', ['af12f10f9ae2002a1607ba0b47ba8407', 'bdfe4e7a431f73386f37c6448afe5840']), ('Similarity', ['af12f10f9ae2002a1607ba0b47ba8407', '5b34c54bee150c04f9fa584b899dc030']), ('Similarity', ['1cdffc6b0b89ff41d68bec237481d1e1', 'af12f10f9ae2002a1607ba0b47ba8407']), ('Similarity', ['4e8e26e3276af8a5c2ac2cc2dc95c6d2', 'af12f10f9ae2002a1607ba0b47ba8407']), ('Similarity', ['af12f10f9ae2002a1607ba0b47ba8407', '1cdffc6b0b89ff41d68bec237481d1e1']), ('Similarity', ['5b34c54bee150c04f9fa584b899dc030', 'af12f10f9ae2002a1607ba0b47ba8407'])]\n \n Inheritance links: [('Inheritance', ['af12f10f9ae2002a1607ba0b47ba8407', 'bdfe4e7a431f73386f37c6448afe5840'])]\n \n 'human' is first link target: [('Similarity', ['af12f10f9ae2002a1607ba0b47ba8407', '4e8e26e3276af8a5c2ac2cc2dc95c6d2']), ('Inheritance', ['af12f10f9ae2002a1607ba0b47ba8407', 'bdfe4e7a431f73386f37c6448afe5840']), ('Similarity', ['af12f10f9ae2002a1607ba0b47ba8407', '5b34c54bee150c04f9fa584b899dc030']), ('Similarity', ['af12f10f9ae2002a1607ba0b47ba8407', '1cdffc6b0b89ff41d68bec237481d1e1'])]\n\n\nThere are other possibilities for filtering such as custom filter methods, target types, etc. They're explained in the [DAS API](https://singnet.github.io/das-query-engine/api/das/).\n\nThere are also convenience methods to get the cursor's \"neighbors\", which are the other atoms pointed by the links attached to the cursor. Let's investigate the neighbors of \"human\". 
Again, we can use the same filters to select which links and targets to consider in order to get the neighbors of the cursor.\n\n\n```\npython\n# All \"human\" neighbors\nprint(\"All neighbors:\", [(d['type'], d['name']) for d in cursor.get_neighbors()])\n\n# Only neighbors linked through Inheritance links\nprint(\"\\nInheritance relations:\", [(d['type'], d['name']) for d in cursor.get_neighbors(link_type='Inheritance')])\n\n# Only neighbors that are similar to \"human\" (i.e. they share a Similarity link)\nprint(\"\\nSimilar to 'human':\", [(d['type'], d['name']) for d in cursor.get_neighbors(link_type='Similarity', cursor_position=0)])\n```\n\n\n All neighbors: [('Concept', 'ent'), ('Concept', 'mammal'), ('Concept', 'chimp'), ('Concept', 'monkey')]\n \n Inheritance relations: [('Concept', 'mammal')]\n \n Similar to 'human': [('Concept', 'ent'), ('Concept', 'chimp'), ('Concept', 'monkey')]\n\n\n\nget_links() and get_neighbors() use the [DAS Cache system](https://github.com/singnet/das/blob/master/docs/das-overview.md) to sort the atoms before they are returned to the caller. In addition to this, these methods return an iterator rather than an actual list of atoms and this iterator is controlled by the cache system as well. The idea here is that atoms may have a large number of links (and consequently neighbors) attached to it so the AI/ML agent may not be interested in iterating on all of them. Atoms are presented in such a way that high importance atoms tend to be presented first while low importance atoms tend to be presented later.\n\nWe can move the cursor by following its links. \n\n\n```\npython\ncursor = das.get_traversal_cursor(das.get_node_handle('Concept', 'human'))\ncursor.follow_link()\ncursor.get()\n```\n\n\n\n\n\n {'handle': '4e8e26e3276af8a5c2ac2cc2dc95c6d2',\n 'type': 'Concept',\n 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3',\n 'name': 'ent',\n 'named_type': 'Concept'}\n\n\n\nfollow_link() just gets the first link returned by get_links() in order to follow it and select a target. The same filters described above can be used here to constraint the links/targets that will be considered. For instance we could use the following code to get the most abstract concept (considering our Inheritance links) starting from \"human\".\n\n\n```\npython\ncursor = das.get_traversal_cursor(das.get_node_handle('Concept', 'human'))\nbase = cursor.get()['name']\nwhile True:\n print(base)\n cursor.follow_link(link_type='Inheritance', cursor_position=0)\n if cursor.get()['name'] == base:\n break\n base = cursor.get()['name']\ncursor.get()\n```\n\n\n human\n mammal\n animal\n\n\n\n\n\n {'handle': '0a32b476852eeb954979b87f5f6cb7af',\n 'type': 'Concept',\n 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3',\n 'name': 'animal',\n 'named_type': 'Concept'}\n\n\n\n\n### Pattern Matcher Queries\n\nDAS can answer pattern matching queries. These are queries where the caller specifies a _pattern_ i.e. a boolean expression of subgraphs with nodes, links and wildcards and the engine finds every subgraph in the knowledge base that satisfies the passed expression. Patterns are a list of Python dicts describing a subgraph with wildcards.\n\nThe method query() expects a pattern and outputs a list of QueryAnswer. 
Each element in such a list has the variable assignment that satisfies the pattern and the subgraph which is the pattern itself rewritten using the given assignment.\n\n\n```\npython\n# This is a pattern like:\n#\n# Inheritance\n# v1\n# plant\n#\n# The expected answer is all Inheritance links whose second target == 'plant'\n#\nquery = {\n 'atom_type': 'link',\n 'type': 'Inheritance',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v1'},\n {'atom_type': 'node', 'type': 'Concept', 'name': 'plant'},\n ]\n}\n\nfor query_answer in das.query(query):\n print(query_answer.assignment)\n atom_matching_v1 = das.get_atom(query_answer.assignment.mapping['v1'])\n print(\"v1:\", atom_matching_v1['type'], atom_matching_v1['name'])\n rewrited_query = query_answer.subgraph\n print(rewrited_query)\n print()\n```\n\n\n [('v1', 'b94941d8cd1c0ee4ad3dd3dcab52b964')]\n v1: Concept vine\n {'handle': 'e4685d56969398253b6f77efd21dc347', 'type': 'Inheritance', 'composite_type_hash': '41c082428b28d7e9ea96160f7fd614ad', 'is_toplevel': True, 'composite_type': ['e40489cd1e7102e35469c937e05c8bba', 'd99a604c79ce3c2e76a2f43488d5d4c3', 'd99a604c79ce3c2e76a2f43488d5d4c3'], 'named_type': 'Inheritance', 'named_type_hash': 'e40489cd1e7102e35469c937e05c8bba', 'targets': [{'handle': 'b94941d8cd1c0ee4ad3dd3dcab52b964', 'type': 'Concept', 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3', 'name': 'vine', 'named_type': 'Concept'}, {'handle': '80aff30094874e75028033a38ce677bb', 'type': 'Concept', 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3', 'name': 'plant', 'named_type': 'Concept'}]}\n \n [('v1', '4e8e26e3276af8a5c2ac2cc2dc95c6d2')]\n v1: Concept ent\n {'handle': 'ee1c03e6d1f104ccd811cfbba018451a', 'type': 'Inheritance', 'composite_type_hash': '41c082428b28d7e9ea96160f7fd614ad', 'is_toplevel': True, 'composite_type': ['e40489cd1e7102e35469c937e05c8bba', 'd99a604c79ce3c2e76a2f43488d5d4c3', 'd99a604c79ce3c2e76a2f43488d5d4c3'], 'named_type': 'Inheritance', 'named_type_hash': 'e40489cd1e7102e35469c937e05c8bba', 'targets': [{'handle': '4e8e26e3276af8a5c2ac2cc2dc95c6d2', 'type': 'Concept', 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3', 'name': 'ent', 'named_type': 'Concept'}, {'handle': '80aff30094874e75028033a38ce677bb', 'type': 'Concept', 'composite_type_hash': 'd99a604c79ce3c2e76a2f43488d5d4c3', 'name': 'plant', 'named_type': 'Concept'}]}\n \n\n\n\n```\npython\n# This is a pattern like:\n#\n# AND\n# Inheritance\n# v1\n# mammal\n# Inheritance\n# v2\n# dinosaur\n# Similarity\n# v1\n# v2\n#\n# The expected answer is all pair of animals such that \n# one inherits from mammal, the other inherits from dinosaur \n# and they have a Similarity link between them.\n#\nexp1 = {\n 'atom_type': 'link',\n 'type': 'Inheritance',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v1'},\n {'atom_type': 'node', 'type': 'Concept', 'name': 'mammal'},\n ]\n}\nexp2 = {\n 'atom_type': 'link',\n 'type': 'Inheritance',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v2'},\n {'atom_type': 'node', 'type': 'Concept', 'name': 'dinosaur'},\n ]\n}\nexp3 = {\n 'atom_type': 'link',\n 'type': 'Similarity',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v1'},\n {'atom_type': 'variable', 'name': 'v2'},\n ]\n}\nquery = [exp1, exp2, exp3] # a list of expressions mean an AND of them\n\nfor query_answer in das.query(query):\n print(query_answer.assignment)\n atom_matching_v1 = das.get_atom(query_answer.assignment.mapping['v1'])\n atom_matching_v2 = das.get_atom(query_answer.assignment.mapping['v2'])\n print(\"v1:\", 
atom_matching_v1['type'], atom_matching_v1['name'])\n print(\"v2:\", atom_matching_v2['type'], atom_matching_v2['name'])\n #rewrited_query = query_answer.subgraph\n #print(rewrited_query)\n print()\n```\n\n\n [('v1', '99d18c702e813b07260baf577c60c455'), ('v2', 'd03e59654221c1e8fcda404fd5c8d6cb')]\n v1: Concept rhino\n v2: Concept triceratops\n \n\n\n\n```\npython\n# This is a pattern like:\n#\n# AND\n# Similarity\n# v1\n# v2\n# Similarity\n# v2\n# v3\n# Similarity\n# v3\n# v1\n#\n# The expected answer is all triplet of animals such that \n# all of them have a Similarity link with the other two.\n#\nexp1 = {\n 'atom_type': 'link',\n 'type': 'Similarity',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v1'},\n {'atom_type': 'variable', 'name': 'v2'},\n ]\n}\nexp2 = {\n 'atom_type': 'link',\n 'type': 'Similarity',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v2'},\n {'atom_type': 'variable', 'name': 'v3'},\n ]\n}\nexp3 = {\n 'atom_type': 'link',\n 'type': 'Similarity',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v3'},\n {'atom_type': 'variable', 'name': 'v1'},\n ]\n}\nquery = [exp1, exp2, exp3] # a list of expressions mean an AND of them\n\nfor query_answer in das.query(query):\n atom_matching_v1 = das.get_atom(query_answer.assignment.mapping['v1'])\n atom_matching_v2 = das.get_atom(query_answer.assignment.mapping['v2'])\n atom_matching_v3 = das.get_atom(query_answer.assignment.mapping['v3'])\n print(\"v1:\", atom_matching_v1['type'], atom_matching_v1['name'])\n print(\"v2:\", atom_matching_v2['type'], atom_matching_v2['name'])\n print(\"v3:\", atom_matching_v3['type'], atom_matching_v3['name'])\n print()\n```\n\n\n v1: Concept monkey\n v2: Concept chimp\n v3: Concept human\n \n v1: Concept human\n v2: Concept monkey\n v3: Concept chimp\n \n v1: Concept chimp\n v2: Concept monkey\n v3: Concept human\n \n v1: Concept monkey\n v2: Concept human\n v3: Concept chimp\n \n v1: Concept human\n v2: Concept chimp\n v3: Concept monkey\n \n v1: Concept chimp\n v2: Concept human\n v3: Concept monkey\n \n\n\n\n## Connecting to a remote DAS\n\nWhen a DAS is instantiated with a remote query engine, it will connect to a DAS Server previously populated with a knowledge base. Atoms in the remote DAS Server become available for fetching, querying and modification.\n\nIn addition to the remote DAS, an internal local DAS is also kept locally. Some of the methods in the API will look for atoms first in this local DAS before going to the remote one. Other methods can be configured to search only in one of them (remote or local) or in both. 
We'll explain this behavior on a case by case basis.\n\nIn our example, we'll connect to a DAS Server pre-loaded with the following MeTTa expressions:\n\n```\n(: Similarity Type)\n(: Concept Type)\n(: Inheritance Type)\n(: \"human\" Concept)\n(: \"monkey\" Concept)\n(: \"chimp\" Concept)\n(: \"snake\" Concept)\n(: \"earthworm\" Concept)\n(: \"rhino\" Concept)\n(: \"triceratops\" Concept)\n(: \"vine\" Concept)\n(: \"ent\" Concept)\n(: \"mammal\" Concept)\n(: \"animal\" Concept)\n(: \"reptile\" Concept)\n(: \"dinosaur\" Concept)\n(: \"plant\" Concept)\n(Similarity \"human\" \"monkey\")\n(Similarity \"human\" \"chimp\")\n(Similarity \"chimp\" \"monkey\")\n(Similarity \"snake\" \"earthworm\")\n(Similarity \"rhino\" \"triceratops\")\n(Similarity \"snake\" \"vine\")\n(Similarity \"human\" \"ent\")\n(Inheritance \"human\" \"mammal\")\n(Inheritance \"monkey\" \"mammal\")\n(Inheritance \"chimp\" \"mammal\")\n(Inheritance \"mammal\" \"animal\")\n(Inheritance \"reptile\" \"animal\")\n(Inheritance \"snake\" \"reptile\")\n(Inheritance \"dinosaur\" \"reptile\")\n(Inheritance \"triceratops\" \"dinosaur\")\n(Inheritance \"earthworm\" \"animal\")\n(Inheritance \"rhino\" \"mammal\")\n(Inheritance \"vine\" \"plant\")\n(Inheritance \"ent\" \"plant\")\n(Similarity \"monkey\" \"human\")\n(Similarity \"chimp\" \"human\")\n(Similarity \"monkey\" \"chimp\")\n(Similarity \"earthworm\" \"snake\")\n(Similarity \"triceratops\" \"rhino\")\n(Similarity \"vine\" \"snake\")\n(Similarity \"ent\" \"human\")\n```\n\n\nSemantically, this is the same knowledge base we used as an example for a local DAS above. However, the mapping to nodes and links is slightly different as described in the [DAS MeTTa Parser](https://github.com/singnet/das-metta-parser) documentation. For instance, each expression, like:\n\n```\n(Similarity \"ent\" \"human\")\n```\n\n\nis mapped to 4 atoms. 3 nodes and 1 link as follows.\n\n```\n{\n 'type': 'Expression',\n 'targets': [\n {'type': 'Symbol', 'name', 'Similarity'},\n {'type': 'Symbol', 'name', '\"ent\"'},\n {'type': 'Symbol', 'name', '\"human\"'}\n ]\n}\n```\n\n\n\n```\npython\nfrom hyperon_das import DistributedAtomSpace\n\nhost = '45.63.85.59'\nport = '8080'\n\nremote_das = DistributedAtomSpace(query_engine='remote', host=host, port=port)\nprint(f\"Connected to DAS Server at {host}:{port}\")\n\nprint(\"(nodes, links) =\", remote_das.count_atoms())\n```\n\n\n Connected to DAS Server at 45.63.85.59:8080\n (nodes, links) = (23, 60)\n\n\nAtoms can be retrieved by their properties using get_atom(), get_node(), get_link(), get_incoming_links() and get_links() in the same way described [here](#atomquery) for local DAS. The only difference is that the local DAS will be searched first for get_atom(), get_node(), get_link() before going to the remote DAS when the atom is not found locally. get_incoming_links() and get_links() will search in both, local and remote DAS, and return an iterator to the results. As we explain [here](#cache), these iterators use the cache system to sort the results and determine how atoms are fetched from the remote DAS.\n\nadd_node() and add_link() will add atoms only in the local DAS. If you add an atom that already exists in the remote DAS, the local copy is always returned by the methods above. To propagate changes to the remote DAS one needs to call commit(). We'll not provide examples of changes in the remote DAS here because we're using a single DAS Server to serve tests with this animals KB so if you commit changes to it everyone will be affected. 
So please don't use this notebook to commit changes to our test server.\n\nfetch() also works in the same way (described [here](#fetch)) for a remote DAS. The only difference is that now the caller can omit the parameters for host and port which are defaulted to the connected remote DAS Server. Fetching from a different DAS Server is still possible by setting the proper values for host and port.\n\nIf you execute the cells below you'll notice a delay between each call. This is because the cache system is not in place yet so each call is issuing an actual query to the remote DAS.\n\n\n```\npython\n# Compute the handle and get the actual document for \"symbol\"\nsymbol = '\"earthworm\"'\nsymbol_handle = remote_das.get_node_handle('Symbol', symbol)\nsymbol_document = remote_das.get_atom(symbol_handle)\nsymbol_document\n```\n\n\n\n\n\n {'handle': '665509d366ac3c2821b3b6b266f996bd',\n 'type': 'Symbol',\n 'composite_type_hash': '02c86eb2792f3262c21d030a87e19793',\n 'name': '\"earthworm\"',\n 'named_type': 'Symbol',\n 'is_literal': True}\n\n\n\n\n```\npython\n# Get expressions like (* base_symbol *)\niterator = remote_das.get_links(link_type='Expression', link_targets=['*', symbol_handle, '*'])\nfor link in iterator:\n atom1 = remote_das.get_atom(link['targets'][0])\n atom2 = remote_das.get_atom(link['targets'][2])\n print(f\"({atom1['name']} {symbol} {atom2['name']})\")\n```\n\n\n (: \"earthworm\" Concept)\n (Inheritance \"earthworm\" \"animal\")\n (Similarity \"earthworm\" \"snake\")\n\n\n\n```\npython\n# Re-adding an existing atom with a custom field\nremote_das.add_node(\n {\n 'type': 'Symbol',\n 'name': symbol,\n 'truth_value': tuple([0.1, 0.9])\n }\n)\nremote_das.get_node('Symbol', symbol)\n```\n\n\n\n\n\n {'handle': '665509d366ac3c2821b3b6b266f996bd',\n 'type': 'Symbol',\n 'composite_type_hash': '02c86eb2792f3262c21d030a87e19793',\n 'name': '\"earthworm\"',\n 'named_type': 'Symbol',\n 'truth_value': (0.1, 0.9)}\n\n\n\n\n```\npython\n# Add (to the local DAS only) a new expression mentioning the base_symbol\nremote_das.add_link(\n { \n 'type': 'Expression', \n 'targets': [ \n {'type': 'Symbol', 'name': 'Pos'}, \n {'type': 'Symbol', 'name': symbol},\n {'type': 'Symbol', 'name': 'noun'}\n ]\n }\n)\n# Get expressions like (* base_symbol *) again\niterator = remote_das.get_links(link_type='Expression', link_targets=['*', symbol_handle, '*'])\nfor link in iterator:\n atom1 = remote_das.get_atom(link['targets'][0])\n atom2 = remote_das.get_atom(link['targets'][2])\n print(f\"({atom1['name']} {symbol} {atom2['name']})\")\n```\n\n\n (Pos \"earthworm\" noun)\n (: \"earthworm\" Concept)\n (Inheritance \"earthworm\" \"animal\")\n (Similarity \"earthworm\" \"snake\")\n\n\nThe methods for traversing the hypergraph work basically in the same way as for the local DAS (this is described [here](#traversing)). Because of the way MeTTa expressions are mapped to nodes/links with only one type of node and one type of link, traversing is less intuitive from a human perspective but it still makes sense to implement algorithms. Local and remote DAS are considered by the TraverseEngine and the whole logic of this component is subject to the cache management rules, i.e., the cache will try to pre-fetch atoms and present query answers prioritizing more relevant atoms as the caller navigates through the atomspace hypergraph.\n\n\n### Querying a remote DAS\n\nThe Pattern Matcher in a remote DAS works basically in the same way as in a local DAS (this is described [here](#patternmatcher)). 
The main difference is the optional parameter query_scope which can be used to define the scope of the query as local_only, remote_only or local_and_remote (its default value is remote_only).\n\n\n```\npython\nquery = {\n 'atom_type': 'link',\n 'type': 'Expression',\n 'targets': [\n {'atom_type': 'variable', 'name': 'v1'},\n {'atom_type': 'node', 'type': 'Symbol', 'name': symbol},\n {'atom_type': 'variable', 'name': 'v2'}\n ]\n}\n\n# The default is to query remote_only\nresults = remote_das.query(query)\nprint(\"Remote only\")\nfor query_answer in results:\n v1_atom = query_answer[1]['targets'][0]\n v2_atom = query_answer[1]['targets'][2]\n print(f\"({v1_atom['name']} {symbol} {v2_atom['name']})\")\n\nresults = remote_das.query(query, {'query_scope': 'local_only'})\nprint()\nprint(\"Local only\")\nfor query_answer in results:\n v1_atom = query_answer.subgraph['targets'][0]\n v2_atom = query_answer.subgraph['targets'][2]\n print(f\"({v1_atom['name']} {symbol} {v2_atom['name']})\")\n\n# local_and_remote is not implemented yet\n#results = remote_das.query(query, {'query_scope': 'local_and_remote'})\n#print(\"Remote + Local\")\n#for query_answer in results:\n# v1_atom = query_answer[1]['targets'][0]\n# v2_atom = query_answer[1]['targets'][2]\n# print(f\"({v1_atom['name']} {symbol} {v2_atom['name']})\")\n```\n\n\n Remote only\n (Inheritance \"earthworm\" \"animal\")\n (: \"earthworm\" Concept)\n (Similarity \"earthworm\" \"snake\")\n \n Local only\n (Pos \"earthworm\" noun)\n\n\n\n### Custom Indexes\n\nRemote DAS allow creation of custom indexes based on custom fields in nodes or links. These indexes can be used to make subsequent custom queries.\n\n\n```\npython\nsymbol_name_index = remote_das.create_field_index('node', 'name', type='Symbol')\nresults = remote_das.custom_query(symbol_name_index, name='\"human\"')\nfor atom in results:\n print(atom['type'], atom['name'])\n```\n\n\n Symbol \"human\"\n\n\nIn this example, we're creating an index for the field name in nodes. name is supposed to be defined in every node of the knowledge base. To create an index on a field which is defined only for a certain type of node, an extra type parameter should be passed to define which type of nodes should enter in the index: e.g. remote_das.create_field_index('node', 'lemma', type='Word') would create an index for the field lemma on all nodes of type Word. This type of index works only for string or number (integer or floating point) fields.\nIndexes for links can be created likewise.\n\n\n## Starting a DAS Server\n\nA DAS Server can be set up using the [DAS Toolbox](https://github.com/singnet/das-toolbox) following these steps:\n\n1. Setup environment variables\n1. Start DB servers\n1. Load MeTTa knowledge base\n1. Start FaaS gateway\n\nFirst, you need to install the latest version of das-cli in your environment. 
Follow the instructions in the [toolbox repo](https://github.com/singnet/das-toolbox) to make this.\n\nThen we'll start by setting up the environment.\n\n*THE COMMANDS BELOW WILL CREATE FILES IN YOUR FILESYSTEM*.\n\nRun the following cell.\n\n\n```\npython\n!das-cli config list\n```\n\n\nIf it outputs something like this:\n\n```\n+----------+----------------+-----------------------+\n| Service | Name | Value |\n+----------+----------------+-----------------------+\n| redis | port | 29000 |\n| redis | container_name | das-cli-redis-29000 |\n| mongodb | port | 28000 |\n| mongodb | container_name | das-cli-mongodb-28000 |\n| mongodb | username | dbadmin |\n| mongodb | password | dassecret |\n| loader | container_name | das-cli-loader |\n| openfaas | container_name | das-cli-openfaas-8080 |\n+----------+----------------+-----------------------+\n```\n\n\nIt's because you already have a config file in ~/.das. If that's the case you need to decide if you want to re-use the same port numbers or not. It's OK to have several databases in your machine. They are Docker containers listening in the given port.\n\nIf the previous das-cli config list command output is empty, you just need to create a new config file. You can do so by running\n\n```\ndas-cli config set\n```\n\n\nIn a terminal. When you have done it, run the next cell to make sure you have a config file in place.\n\n\n```\npython\n!das-cli config list\n```\n\n\nContainers for the DBMS servers and OpenFaas will be created listening on the given ports. Run the next cell to make sure any previously used containers are properly removed. If there are none, nothing will be done.\n\n\n```\npython\n!das-cli db stop\n!das-cli faas stop\n```\n\n\nNow we need to start the DBMS servers.\n\n\n```\npython\n!das-cli db start\n```\n\n\nYou can double check that the DB containers are in place listing the active docker containers.\n\n\n```\npython\n!docker ps\n```\n\n\nYou should see containers for Redis and MongoDB listening on the ports you defined in the config file.\n\nNow we need to load a MeTTa file. You can use your own file here or run the next cell to download the same file we used in [this section](#remotedas).\n\n\n```\npython\n!wget -o /tmp/.get.output https://raw.githubusercontent.com/singnet/das-metta-parser/master/tests/data/animals.metta && mv -f animals.metta /tmp\n```\n\n\nYou may want to change the path in the cell below to point to another file.\n\n\n```\npython\n!das-cli metta load /tmp/animals.metta\n```\n\n\nYou may call das-cli metta load multiple times loading different files. To clear the databases you can use das-cli db restart.\n\nOnce you're done loading the knowledge base, you need to start the FaaS server.\n\n\n```\npython\n!das-cli faas start\n```\n\n\nIt's done. At this point you should be able to point one or more remote DAS to this DAS Server, as we described [here](#remotedas)." }, { "r": "assistant", "t": "Yes, Hyperon DAS can be effectively used on a small local system, especially if you are primarily dealing with in-memory operations and small to medium-sized datasets. Here���s why it���s appropriate:\n\nLocal Operation: The local DAS mode stores the Atomspace entirely in RAM, making it fast and suitable for lightweight or development use cases without requiring a complex database setup.\n\nPersistence Options: For persistence, you can opt to integrate Redis or MongoDB as backends using the DAS Toolbox for a more robust and fault-tolerant setup. 
This will help handle power failures and data corruption issues, providing backup and recovery options.\n\nFlexible API: The API remains the same whether you use it locally or remotely, making it easy to scale your application without significant refactoring if you choose to move to a distributed setup later.\n\nIntegration with MeTTa: DAS supports querying and pattern matching similar to MeTTa, making it a good fit if you want to use MeTTa alongside DAS for reasoning and querying.\n\n### Next Steps:\n\n- Start with a local DAS in memory (query_engine='local').\n- Use add_node() and add_link() methods for building the Atomspace.\n- Integrate simple persistence using Redis or MongoDB via the DAS Toolbox if needed for reliability.\n- Look into the DAS serialization options for exporting and importing Atomspace content (this may require custom handling).\n\nFor small, local use, this setup should be lightweight and efficient." }, { "r": "user", "t": "do you think we could do small persistence with sqlite?" }, { "r": "assistant", "t": "Yes, SQLite can be a suitable option for small, local persistence of the Atomspace using Hyperon DAS. Here's why it may work well:\n\n### Advantages:\n\n- Lightweight: SQLite is embedded, meaning it doesn't require a separate server process. This makes it a good choice for small-scale, local systems.\n- Simplicity: Using SQLite involves minimal setup, and it's easy to integrate into Python projects with the built-in sqlite3 module.\n- ACID Compliance: SQLite offers ACID properties, which means it can handle transactions reliably, protecting against data corruption in case of power failures.\n- Backup and Recovery: SQLite databases are easy to back up (via the .backup() API or a file copy), making it a good fit for scenarios needing quick recovery options.\n\n### Potential Challenges:\n\n- Concurrency: SQLite handles concurrent access well for reads but can become a bottleneck with many simultaneous write operations.
However, for small-scale, single-user scenarios, this shouldn't be a problem.\n- Schema Mapping: Hyperon's Atomspace uses a graph-based structure, so you will need to map nodes and links to SQLite tables effectively, potentially using an adjacency list or edge list schema.\n\n### Implementation Strategy:\n\n- Use two tables in SQLite: one for nodes (atoms) and another for links (edges).\n- Store each atom as a row in the atoms table, with its type, name, and handle.\n- Store links in the edges table, using foreign keys to reference source and target atoms.\n\n### Example Schema:\n\n```sql\nCREATE TABLE atoms (\n    id INTEGER PRIMARY KEY,\n    handle TEXT UNIQUE,\n    type TEXT,\n    name TEXT\n);\n\nCREATE TABLE edges (\n    id INTEGER PRIMARY KEY,\n    type TEXT,\n    source_handle TEXT,\n    target_handle TEXT,\n    FOREIGN KEY(source_handle) REFERENCES atoms(handle),\n    FOREIGN KEY(target_handle) REFERENCES atoms(handle)\n);\n```\n\n### Example Python Integration:\n\n```python\nimport sqlite3\n\ndef init_db(conn):\n    with conn:\n        conn.execute(\"CREATE TABLE IF NOT EXISTS atoms (id INTEGER PRIMARY KEY, handle TEXT UNIQUE, type TEXT, name TEXT)\")\n        conn.execute(\"CREATE TABLE IF NOT EXISTS edges (id INTEGER PRIMARY KEY, type TEXT, source_handle TEXT, target_handle TEXT)\")\n\ndef add_atom(conn, handle, atom_type, name):\n    with conn:\n        conn.execute(\"INSERT OR IGNORE INTO atoms (handle, type, name) VALUES (?, ?, ?)\", (handle, atom_type, name))\n\ndef add_edge(conn, edge_type, source_handle, target_handle):\n    with conn:\n        conn.execute(\"INSERT INTO edges (type, source_handle, target_handle) VALUES (?, ?, ?)\", (edge_type, source_handle, target_handle))\n\n# Example usage\nconn = sqlite3.connect('atomspace.db')\ninit_db(conn)\nadd_atom(conn, 'human-handle', 'Concept', 'human')\nadd_atom(conn, 'mammal-handle', 'Concept', 'mammal')\nadd_edge(conn, 'Inheritance', 'human-handle', 'mammal-handle')\n```\n\n### Backup Strategy:\n\n- Use SQLite's .backup() method to create a backup file periodically.\n- Store backups in a separate directory to safeguard against corruption.\n\nSQLite's ease of use, small footprint, and built-in transactional support make it a practical option for small-scale persistence in Hyperon DAS." }, { "r": "user", "t": "so no indication so far that das supports sql out-of-the-box" }, { "r": "assistant", "t": "Correct, there is no indication that Hyperon DAS currently supports an SQL backend, such as SQLite, out-of-the-box.
The typical backend for DAS appears to be in-memory (RAM) or uses distributed NoSQL databases like Redis or MongoDB when scaled for larger or remote use cases.\n\n### Summary:\n\n- Local DAS: Operates entirely in RAM.\n- Remote DAS: Uses external databases like Redis and MongoDB (as part of the DAS server setup).\n- No direct SQL support: There is no explicit SQL-based backend or configuration option for DAS listed in the documentation.\n\n### What This Means:\n\nIf you want to use SQLite or any SQL backend, you would need to implement a custom storage layer that maps DAS atom operations (nodes, links, and queries) to SQL operations manually. This is doable but would involve handling:\n\n- Schema design: Creating tables for nodes, links, and properties.\n- Data mapping: Translating DAS API calls to SQL queries.\n- Persistence logic: Ensuring ACID compliance and handling serialization/deserialization.\n\nIn summary, you would need to write a custom SQLite adapter for persistence if you want to use SQL." }, { "r": "user", "t": "makes sense to use a nosql db if they're already tuned for that i guess" }, { "r": "assistant", "t": "Yes, that's the practical choice. The existing NoSQL backends (Redis, MongoDB) integrated with DAS are well-tuned for:\n\n- Graph and Hypergraph Storage: Efficiently store nodes and relationships typical of knowledge graphs.\n- Flexible Data Models: Support dynamic, schema-less data (essential for handling various atom types).\n- Performance: Optimized for fast in-memory access (Redis) or scalable, document-based storage (MongoDB).\n\n### Recommendation:\n\nStick with the supported NoSQL backends:\n\n- Redis: Ideal for small, fast, in-memory storage. Great if you need low-latency responses.\n- MongoDB: Better for larger datasets, supports complex queries, and provides persistence across sessions.\n\nIf your project starts small but might grow, you can use Redis for quick prototyping and then move to MongoDB as the data size increases, without needing major changes to your DAS configuration.\n\nThis approach avoids reinventing the wheel and leverages the optimizations already built into DAS for handling knowledge representation." }, { "r": "user", "t": "das-toolbox assumes docker, i'd like to not require docker :s" }, { "r": "assistant", "t": "Avoiding Docker means you need to manually set up the dependencies that das-toolbox typically provisions with Docker containers. This includes:\n\n- Redis or MongoDB for the NoSQL backend.\n- Setting up the environment variables and configuring das-cli to use your local services.\n- Managing service startup and persistence manually.\n\n### Steps to Run DAS Locally Without Docker\n\n#### 1. Install Redis or MongoDB Locally\n\nFor Redis, use:\n\n```bash\nsudo apt-get install redis-server\nsudo systemctl enable redis-server\nsudo systemctl start redis-server\n```\n\nFor MongoDB, use:\n\n```bash\nsudo apt-get install mongodb\nsudo systemctl enable mongodb\nsudo systemctl start mongodb\n```\n\n#### 2. Set Up Environment Variables\n\nConfigure DAS to connect to your local Redis or MongoDB instance.\n\n```bash\nexport REDIS_HOST=localhost\nexport REDIS_PORT=6379\nexport MONGODB_URI=mongodb://localhost:27017\n```\n\n#### 3. Configure DAS\n\nYou can manually edit the DAS config file in ~/.das/config.yaml:\n\n```yaml\nredis:\n  host: localhost\n  port: 6379\nmongodb:\n  uri: mongodb://localhost:27017\n```\n\n#### 4.
Load the Knowledge Base Without Docker\n\nUse das-cli commands directly:\n\n```bash\ndas-cli db restart\ndas-cli metta load /path/to/your/knowledge.metta\n```\n\n#### 5. Start the DAS Server Manually\n\nInstead of using Docker's faas, start the necessary services manually or script their startup:\n\n```bash\ndas-cli faas start\n```\n\n#### 6. Test Your Setup\n\nEnsure everything is running properly:\n\n```bash\ndas-cli status\n```\n\n### Downsides\n\n- Manual installation and management of services can be more error-prone and less consistent.\n- No automatic isolation or easy cleanup like with Docker.\n\nIf these challenges are acceptable, this approach will let you avoid Docker while still using DAS effectively." }, { "r": "user", "t": ">( irritating that the only standalone persistence mentioned so far is ram" }, { "r": "assistant", "t": "Yes, that's a significant drawback. The default local DAS configuration (query_engine='local') stores data only in RAM, making it volatile and unsuitable for applications that require persistence across sessions or after a crash.\n\n### Options for Persistent Storage Without Docker\n\n- Redis and MongoDB (NoSQL databases):\n  - These are well-supported backends for DAS persistence but require manual setup and configuration outside of Docker.\n  - Redis can be used for fast, in-memory data with optional disk persistence.\n  - MongoDB offers a document-based storage model that fits well with the atom-based data in DAS.\n- SQLite or Other Local Database Options:\n  - DAS does not natively support SQL databases like SQLite.\n  - Implementing a custom persistence layer using SQLite would require translating DAS atom structures into relational tables and managing serialization/deserialization manually.\n- Flat File Persistence:\n  - You could serialize the DAS atomspace (using JSON or another format) and save it to a file. However, this approach may be slow and inefficient for large datasets.\n\n### Recommended Approach\n\nGiven the constraints:\n\n- Set up Redis or MongoDB locally without Docker for the simplest integration with DAS, leveraging existing support and optimizations.\n- If you prefer a smaller footprint, Redis with AOF (Append-Only File) persistence can provide durability without the complexity of MongoDB.\n\nWould you like a more detailed guide on setting up Redis/MongoDB for persistence, or do you want to explore custom solutions like SQLite?" }, { "r": "user", "t": "hmm i thought of redis as heavyweight, is aof lightweight?" }, { "r": "assistant", "t": "Yes, Redis with Append-Only File (AOF) persistence can be relatively lightweight, depending on configuration and usage. Here's why AOF can be a good choice:\n\n### AOF Persistence in Redis\n\n- Append-Only Logging: Redis writes every operation to a log file (appendonly.aof).
This log is appended rather than rewritten frequently, minimizing disk I/O overhead.\n- Compact and Configurable:\n  - You can control how frequently the AOF file is synced to disk (always, everysec, or no).\n  - everysec provides a good balance between durability and performance, syncing data once per second.\n- Lightweight Setup:\n  - Redis is easy to install (yum install redis on RHEL-based systems).\n  - It requires minimal system resources if tuned correctly (e.g., limiting memory usage with maxmemory).\n\n### Advantages\n\n- Fast In-Memory Performance with Durability: Unlike heavier databases like MongoDB, Redis operates primarily in-memory, making it very fast while still offering persistence.\n- Simple Configuration: No need for complex schemas or manual serialization.\n- Low Overhead: For small data sets or infrequent writes, the AOF file remains compact.\n\n### Disadvantages\n\n- Memory Usage: Redis stores all data in RAM, so it's not ideal for very large datasets.\n- Write Amplification: If you have high write throughput, the AOF file can grow large, but it can be compacted (rewritten) with BGREWRITEAOF.\n\n### Conclusion\n\nIf your dataset is relatively small and performance is a concern, Redis with AOF can be a lightweight and efficient solution. It avoids the complexity of full-fledged databases like MongoDB while providing more durability than pure RAM storage.\n\nWould you like steps for setting up Redis with AOF for your use case?" }, { "r": "user", "t": "i think it may be complicated to perform all the system administration and source maintenance behaviors required to use hyperon-das on a single system without docker, not yet having found if there's a packaged way to\n\nhowever i do suspect if i mentioned this to the project devs they'd add something to support my use case since it sounds pretty simple\n\nbut maybe it would make sense to do manual storage for now :s" }, { "r": "assistant", "t": "Yes, opting for manual storage initially makes sense. It allows you to quickly get started without complex system dependencies like Docker or maintaining Redis/MongoDB setups. A custom, lightweight solution could involve:\n\n- Manual Serialization with SQLite or Flat Files:\n  - Serialize MeTTa knowledge base objects using pickle or json (if compatible).\n  - Store the serialized data in an SQLite table or simple .json file.\n- Incremental Backups:\n  - Keep a small write-ahead log (WAL) for changes, periodically merging it with the main database.\n  - This approach is robust to power failures and doesn't require a heavy setup.\n- Fallback Storage Options:\n  - Use an in-memory database like SQLite (:memory:) with periodic dumps to disk.\n  - This approach would mimic Hyperon's current RAM-based storage but with added persistence.\n\n### Example Approach\n\n- Serialize the knowledge base periodically using a format like JSON.\n- Write a simple wrapper to load this state at startup and save it on exit.\n- Use SQLite as an optional enhancement if you need more structured queries or transactions.\n\nWould you like a starting example of manual serialization using Python?" }, { "r": "user", "t": "let's use sqlite to take advantage of its fault tolerance\ncan you update your last full interaction script?" }, { "r": "assistant", "t": "Here's an updated version of the persistence script using SQLite for storage.
This will integrate MeTTa with an SQLite database to handle knowledge persistence.\n\n### Key Changes:\n\nSQLite Storage:\n\nWe'll use a table in SQLite to store the serialized atom data.Atom data will be serialized using pickle.SQLite's journal mode will ensure fault tolerance.Load/Save Mechanisms:\n\nOn startup, it loads the knowledge base from SQLite.After each user input, changes are saved to SQLite.### Requirements\n\nMake sure you have these Python packages:\n\n```\nbash\nCopy code\npip install hyperon sqlite3\n\n```\n### Full Script\n\n```\npython\nCopy code\nimport sqlite3\nimport pickle\nfrom hyperon import MeTTa, S, V, E\nfrom datetime import datetime\n\n# Initialize MeTTa interpreter and SQLite database\nmetta = MeTTa()\nDB_PATH = \"knowledge_base.db\"\n\ndef init_db():\n \"\"\"Initialize SQLite database and create table if it doesn't exist.\"\"\"\n conn = sqlite3.connect(DB_PATH)\n cursor = conn.cursor()\n cursor.execute(\"\"\"\n CREATE TABLE IF NOT EXISTS atoms (\n id INTEGER PRIMARY KEY AUTOINCREMENT,\n timestamp TEXT,\n atom_data BLOB\n );\n \"\"\")\n conn.commit()\n conn.close()\n\ndef save_atom(atom):\n \"\"\"Serialize and save a MeTTa atom to SQLite.\"\"\"\n conn = sqlite3.connect(DB_PATH)\n cursor = conn.cursor()\n serialized_atom = pickle.dumps(atom)\n timestamp = datetime.now().isoformat()\n cursor.execute(\"INSERT INTO atoms (timestamp, atom_data) VALUES (?, ?);\", (timestamp, serialized_atom))\n conn.commit()\n conn.close()\n\ndef load_atoms():\n \"\"\"Load serialized atoms from SQLite and add them to the MeTTa space.\"\"\"\n conn = sqlite3.connect(DB_PATH)\n cursor = conn.cursor()\n cursor.execute(\"SELECT atom_data FROM atoms;\")\n rows = cursor.fetchall()\n for row in rows:\n atom = pickle.loads(row[0])\n metta.space().add_atom(atom)\n conn.close()\n\ndef run_metta_query(query):\n \"\"\"Run a MeTTa query and return the result.\"\"\"\n result = metta.run(query)\n return result\n\ndef main_loop():\n \"\"\"Main run loop for user input and MeTTa interactions.\"\"\"\n load_atoms()\n print(\"Knowledge base loaded. Enter MeTTa expressions or 'exit' to quit.\")\n\n while True:\n user_input = input(\"MeTTa> \")\n if user_input.lower() == \"exit\":\n print(\"Exiting and saving state...\")\n break\n\n try:\n # Try running the user input as a MeTTa command\n result = run_metta_query(f\"! {user_input}\")\n print(\"Result:\", result)\n\n # Save each new atom to SQLite for persistence\n for atom in metta.parse_all(user_input):\n save_atom(atom)\n\n except Exception as e:\n print(\"Error:\", e)\n\nif __name__ == \"__main__\":\n init_db()\n main_loop()\n\n```\n### Explanation:\n\nInitialization: The init_db() function sets up the SQLite database if it doesn't already exist.Persistence: save_atom() stores each new atom to the SQLite database using pickle for serialization.Loading: load_atoms() retrieves and deserializes stored atoms into the MeTTa space on startup.Main Loop: Users can enter MeTTa expressions, which are parsed and executed. Results are displayed, and new atoms are persisted.### Considerations:\n\nFault Tolerance: SQLite's journal mode (default) provides fault tolerance, making this setup more resilient to power failures.Scalability: This is intended for small to moderate data sizes. For larger datasets, a more robust database (e.g., Redis, MongoDB) may be necessary.Would you like additional enhancements or testing code?\n\n4o" }, { "r": "user", "t": "Error: cannot pickle 'hyperonpy.CAtom' object\nwould it help to make use of hyperon-das?" 
}, { "r": "assistant", "t": "Yes, using Hyperon-DAS would be a more fitting choice here for persistence, as it is designed for storing and querying Hyperon's atomspace efficiently and already handles serialization of the atom data. SQLite, while reliable, doesn't natively support the hyperonpy.CAtom objects, leading to the pickling error.\n\n### Why Use Hyperon-DAS:\n\nBuilt-in Serialization: DAS already manages serialization and deserialization of atoms.Query and Storage: It provides a NoSQL-like interface optimized for atom storage and retrieval.Fault Tolerance: DAS can be configured with Redis (AOF mode) for persistence, offering resilience to power failures.### Approach:\n\nLocal DAS: Use a local instance of DAS (without Docker) for storing atoms in memory or with Redis for persistence.Integration: Store new atoms in the DAS after parsing, and retrieve them on startup.I will now refactor the script to use Hyperon-DAS for atom storage and retrieval.\n\n### Updated Script Using Hyperon-DAS\n\n```\npython\nCopy code\nfrom hyperon import MeTTa\nfrom hyperon_das import DistributedAtomSpace\nfrom datetime import datetime\n\n# Initialize MeTTa interpreter and DAS\nmetta = MeTTa()\ndas = DistributedAtomSpace(query_engine='local') # Use local DAS\n\ndef load_atoms_from_das():\n \"\"\"Load atoms from DAS into MeTTa space.\"\"\"\n print(\"Loading atoms from DAS...\")\n atoms = das.get_all_atoms()\n for atom in atoms:\n metta.space().add_atom(atom)\n print(f\"Loaded {len(atoms)} atoms.\")\n\ndef save_atoms_to_das(atoms):\n \"\"\"Save a list of atoms to DAS.\"\"\"\n for atom in atoms:\n das.add_atom(atom)\n\ndef run_metta_query(query):\n \"\"\"Run a MeTTa query and return the result.\"\"\"\n result = metta.run(query)\n return result\n\ndef main_loop():\n \"\"\"Main run loop for user input and MeTTa interactions.\"\"\"\n load_atoms_from_das()\n print(\"Knowledge base loaded. Enter MeTTa expressions or 'exit' to quit.\")\n\n while True:\n user_input = input(\"MeTTa> \")\n if user_input.lower() == \"exit\":\n print(\"Exiting...\")\n break\n\n try:\n # Run the user input as a MeTTa command\n result = run_metta_query(f\"! {user_input}\")\n print(\"Result:\", result)\n\n # Parse user input and save new atoms to DAS\n parsed_atoms = metta.parse_all(user_input)\n save_atoms_to_das(parsed_atoms)\n\n except Exception as e:\n print(\"Error:\", e)\n\nif __name__ == \"__main__\":\n main_loop()\n\n```\n### Explanation:\n\nDAS Initialization: Uses a local DAS instance to store atoms in memory.Load/Save: load_atoms_from_das() retrieves atoms from DAS, and save_atoms_to_das() stores new atoms.Main Loop: User input is parsed and evaluated, and any new atoms are persisted in DAS.### Next Steps:\n\nPersistence: To enable persistence across reboots, configure DAS to use Redis with AOF mode.Testing: Run the script to ensure the atoms are correctly stored and retrieved.Would you like guidance on setting up Redis for DAS or further adjustments?\n\n4o" }, { "r": "user", "t": "i mean, does hyperon-das help with implementing sqlite persistence? or is it just extra complication?" }, { "r": "assistant", "t": "Using Hyperon-DAS for SQLite persistence would likely add unnecessary complexity. Hyperon-DAS is designed for in-memory operation or integration with NoSQL databases like Redis, MongoDB, or a distributed backend. 
It does not natively support using SQL databases like SQLite for its atom storage, so adding SQLite would require a significant custom adaptation layer.\n\n### Why Not Use Hyperon-DAS with SQLite:\n\nNo Direct Integration: Hyperon-DAS does not have built-in support for SQL backends. Adding SQLite would require implementing a custom storage adapter, which may duplicate existing persistence mechanisms.\n\nSerialization Issues: The current issue (cannot pickle 'hyperonpy.CAtom' object) would still need to be addressed when using SQLite, as DAS does not resolve serialization for SQL-compatible formats.\n\nAdded Complexity: Hyperon-DAS introduces a layer of distributed atomspace management and query capabilities that may not be necessary for a single-system setup focused on lightweight, reliable storage.\n\n### Recommendation:\n\nImplement a manual SQLite storage solution instead. This will give you:\n\nDirect Control: over how atoms are serialized, stored, and retrieved.\n\nLightweight Setup: Avoids the overhead of setting up Redis or adapting DAS for SQL storage.\n\nFault Tolerance: SQLite's transactional guarantees offer strong durability without complex configuration.\n\n### Next Steps:\n\nCustom Serialization: Use a manual approach to serialize CAtom objects into a format like JSON or a binary blob that SQLite can store.\n\nSchema Design: Define a simple SQLite schema for storing atoms with metadata (e.g., type, name, targets).\n\nIntegration: Add functions to load from and save to SQLite during your program's startup and shutdown.\n\nWould you like to proceed with an SQLite-based solution and need an implementation outline or code example for it?" }, { "r": "user", "t": "there are reasonable ways to do this we just have to learn the project enough to find one" }, { "r": "assistant", "t": "Agreed. Hyperon's evolving ecosystem may contain utilities or best practices for serialization and persistence that aren't immediately obvious. Let's break this down into steps to ensure we leverage the existing capabilities without reinventing:\n\n### Plan:\n\nSearch Hyperon-DAS or Hyperon project resources: Investigate if there are hidden or under-documented serialization tools or recommended patterns for persistence, possibly outside the main tutorials.\n\nCheck with the Hyperon community: Reaching out directly or browsing recent discussions (GitHub issues, community forums) could uncover in-progress work or upcoming features supporting local persistence without Docker/Redis overhead.\n\nFallback Plan: If nothing exists, we implement manual persistence with SQLite, using a robust serialization method (e.g., JSON or protobuf) for Hyperon objects.\n\nWould you prefer we continue with investigation or begin outlining a fallback SQLite solution?" }, { "r": "user", "t": "websearch to figure out how to serialise atoms\ni'll look a little more myself" }, { "r": "assistant", "t": "To implement persistence for Hyperon atoms using SQLite, you'll need to serialize and deserialize these atoms effectively. Hyperon provides a Python API that includes classes and methods for handling atoms, which can be leveraged for this purpose.\n\n### Serialization and Deserialization of Atoms\n\nHyperon's Python API defines various atom types, such as SymbolAtom, VariableAtom, ExpressionAtom, and GroundedAtom.
Each of these classes has methods to access their properties, which can be used to convert atoms to a serializable format (e.g., a dictionary) and back.\n\nHere's a general approach:\n\nSerialization: Convert each atom into a dictionary containing its type and relevant properties.\n\nDeserialization: Reconstruct the atom from the dictionary by identifying its type and initializing it with the stored properties.\n\n### Example Implementation\n\nBelow is an example of how you might implement serialization and deserialization for Hyperon atoms:\n\n```\npython\nimport json\nfrom hyperon import AtomKind, S, V, E, G, GroundedObject\n\ndef serialize_atom(atom):\n    if atom.get_metatype() == AtomKind.SYMBOL:\n        return {'type': 'symbol', 'name': atom.get_name()}\n    elif atom.get_metatype() == AtomKind.VARIABLE:\n        return {'type': 'variable', 'name': atom.get_name()}\n    elif atom.get_metatype() == AtomKind.EXPR:\n        return {\n            'type': 'expression',\n            'children': [serialize_atom(child) for child in atom.get_children()]\n        }\n    elif atom.get_metatype() == AtomKind.GROUNDED:\n        # Assuming the grounded object can be represented as a string\n        return {\n            'type': 'grounded',\n            'object': str(atom.get_object())\n        }\n    else:\n        raise ValueError(f\"Unsupported atom type: {atom.get_metatype()}\")\n\ndef deserialize_atom(data):\n    atom_type = data['type']\n    if atom_type == 'symbol':\n        return S(data['name'])\n    elif atom_type == 'variable':\n        return V(data['name'])\n    elif atom_type == 'expression':\n        children = [deserialize_atom(child) for child in data['children']]\n        return E(*children)\n    elif atom_type == 'grounded':\n        # Implement appropriate deserialization for grounded objects\n        grounded_object = GroundedObject(data['object'])\n        return G(grounded_object)\n    else:\n        raise ValueError(f\"Unsupported atom type: {atom_type}\")\n\n# Example usage\natom = E(S('add'), V('x'), V('y'))\nserialized = serialize_atom(atom)\njson_data = json.dumps(serialized)\nprint(f\"Serialized: {json_data}\")\n\ndeserialized_data = json.loads(json_data)\ndeserialized_atom = deserialize_atom(deserialized_data)\nprint(f\"Deserialized: {deserialized_atom}\")\n\n```\n### Storing in SQLite\n\nOnce you have the serialized atom as a JSON string, you can store it in an SQLite database. Here's an example:\n\n```\npython\nimport sqlite3\n\n# Connect to SQLite database (or create it)\nconn = sqlite3.connect('atoms.db')\ncursor = conn.cursor()\n\n# Create table for atoms\ncursor.execute('''\nCREATE TABLE IF NOT EXISTS atoms (\n    id INTEGER PRIMARY KEY,\n    atom_json TEXT NOT NULL\n)\n''')\n\n# Insert serialized atom\ncursor.execute('INSERT INTO atoms (atom_json) VALUES (?)', (json_data,))\nconn.commit()\n\n# Retrieve and deserialize atom\ncursor.execute('SELECT atom_json FROM atoms WHERE id = ?', (1,))\nrow = cursor.fetchone()\nif row:\n    deserialized_data = json.loads(row[0])\n    deserialized_atom = deserialize_atom(deserialized_data)\n    print(f\"Retrieved and deserialized: {deserialized_atom}\")\n\n# Close the connection\nconn.close()\n\n```\n### Considerations\n\nGrounded Atoms: Serializing grounded atoms (GroundedAtom) requires careful handling, especially if they encapsulate complex objects or functions.
Ensure that the grounded objects are serializable or implement custom serialization logic for them.\n\nData Integrity: When storing serialized data in SQLite, consider implementing checks or constraints to maintain data integrity.\n\nPerformance: For large numbers of atoms or frequent read/write operations, assess the performance implications and optimize the database schema and queries accordingly.\n\nBy implementing serialization and deserialization functions tailored to Hyperon's atom structures, you can effectively persist and retrieve atoms using SQLite, facilitating data storage and retrieval in your applications." }, { "r": "user", "t": "i have found that the metta standard library includes primitives for serialization into abstracted formats. these functions are documented in the source code and appear to be intended to be listed in an api documentation, however i do not see them listed on the website, it appears the website is lacking the api documentation.\ni think i'll build it from source and see if the api docs generate." }, { "r": "assistant", "t": "The MeTTa standard library includes serialization primitives designed to convert atoms into abstract formats. These functions are documented within the source code and are intended for inclusion in the API documentation. However, it appears that the API documentation is not currently available on the project's website. To access these details, you might consider building the documentation from the source code. This process typically involves using tools like Doxygen or Sphinx, depending on the project's setup, to generate the documentation locally. By doing so, you can explore the available serialization functions and their intended usage within the MeTTa framework." }, { "r": "user", "t": "here is the readme. the source contains a Cargo.toml, a CMakeLists.txt, and a mkdocs.yml . is it clear how to build the documentation?\n\n![CI](https://github.com/trueagi-io/hyperon-experimental/actions/workflows/ci-auto.yml/badge.svg)\n\n# Overview\n\nOpenCog Hyperon is a substantially revised, novel version of OpenCog - which is currently at an active\npre-alpha stage of development and experimentation. One of the focuses in the Hyperon design is a successor\nto the OpenCog Classic Atomese language with clear semantics supporting meta-language features,\ndifferent types of inference, etc.
What we have landed on is an \"Atomese 2\" language called MeTTa (Meta Type Talk).\n\nIn order to get familiar with MeTTa one can visit [MeTTa website](https://metta-lang.dev)\nand watch video with different [MeTTa examples explained](https://singularitynet.zoom.us/rec/share/VqHmU37XtbS7VnKY474tkTvvTglsgOIfsI-21MXWxVm_in7U3tGPcfjjiE0P_15R.yUwPdCzEONSUx1EL?startTime=1650636238000).\nThe examples of MeTTa programs can be found in [./python/tests/scripts](./python/tests/scripts) directory.\nPlease look at the [Python unit tests](./python/tests) to understand how one can use MeTTa from Python.\nMore complex usage scenarios are located at [MeTTa examples repo](https://github.com/trueagi-io/metta-examples).\nA lot of different materials can be found on [OpenCog wiki server](https://wiki.opencog.org/w/Hyperon).\nAlso see [MeTTa specification](https://wiki.opencog.org/w/File:MeTTa_Specification.pdf).\n\nIf you want to contribute the project please see the [contributing guide](./docs/CONTRIBUTING.md) first.\nIf you find troubles with the installation, see the [Troubleshooting](#troubleshooting) section below.\nFor development related instructions see the [development guide](./docs/DEVELOPMENT.md).\n\n# Using the latest release version\n\nIt is the most simple way of getting MeTTa interpreter especially if you are a Python developer.\nThe following command installs the latest release version from PyPi package repository:\n```\npython3 -m pip install hyperon\n```\n\n\nAnother way is using released Docker image:\n```\ndocker run -ti trueagi/hyperon:latest\n```\n\n\nAfter installing package or starting Docker container run MeTTa Python based\ninterpreter:\n```\nmetta-py\n```\n\n\nUsing Docker you can also run Rust REPL:\n```\nmetta-repl\n```\n\n\n# Using latest development version\n\n## Docker\n\nA docker image can be used as a ready to run stable and predictable development\nenvironment. Docker 26.0.0 or greater version is required to build image\nmanually.\n\nBuild Docker image from a local copy of the repo running:\n```\ndocker build -t trueagi/hyperon .\n```\n\n\nOr build it without local copy of the repo running:\n```\ndocker build \\\n --build-arg BUILDKIT_CONTEXT_KEEP_GIT_DIR=1 \\\n -t trueagi/hyperon \\\n http://github.com/trueagi-io/hyperon-experimental.git#main\n```\n\n\nUse --target build option to create an image which keeps the full build\nenvironment and can be used for developing interpreter:\n```\ndocker build --target build -t trueagi/hyperon .\n```\n\n\nIf the docker image doesn't work, please raise an\n[issue](https://github.com/trueagi-io/hyperon-experimental/issues).\n\n## Manual installation\n\n### Prerequisites\n\n* Install the latest stable Rust, see [Rust installation\npage](https://www.rust-lang.org/tools/install). 
Make sure your\nPATH variable includes $HOME/.cargo/bin directory after installing\nRust (see the Notes at the installation page).\n\n Requirements for building C and Python API\n * Python3 and Python3-dev (3.7 or later)\n * Pip (23.1.2 or later)\n * GCC (7.5 or later)\n * CMake (3.24 or later)\n\n To support Git based modules (enabled by default):\n * OpenSSL library\n * Zlib library\n\n* Install cbindgen:\n```\ncargo install --force cbindgen\n```\n\n\n* Install Conan and make default Conan profile:\n```\npython3 -m pip install conan==2.5.0\nconan profile detect --force\n```\n\n\n* Upgrade Pip to the required version:\n```\npython3 -m pip install pip==23.1.2\n```\n\n\n# Build and run\n\n## Rust library and REPL\n\nBuild and test the Rust binaries:\n```\ncargo test\n```\n\n\nThe experimental features can be enabled by editing\n[Cargo.toml](./lib/Cargo.toml) file before compilation or by using --features\n[command line option](https://doc.rust-lang.org/cargo/reference/features.html#command-line-feature-options).\nSee comments in the [features] section of the file for the features\ndescriptions.\n\nRun examples:\n```\ncargo run --example sorted_list\n```\n\n\nRun Rust REPL:\n```\ncargo run --bin metta-repl\n```\n\nYou can also find executable at ./target/debug/metta-repl.\n\nTo enable logging during running tests or examples export RUST_LOG\nenvironment variable:\n```\nRUST_LOG=hyperon=debug cargo test\n```\n\n\nRunning benchmarks requires nightly toolchain so they can be run using:\n```\ncargo +nightly bench\n```\n\n\nGenerate docs:\n```\ncargo doc --no-deps\n```\n\nDocs can be found at ./target/doc/hyperon/index.html.\n\n## C and Python API\n\nSetup build:\n```\nmkdir -p build\ncd build\ncmake ..\n```\n\nTo run release build use -DCMAKE_BUILD_TYPE=Release cmake flag.\n\nBuild and run tests:\n```\nmake\nmake check\n```\n\n\n## Running Python and MeTTa examples\n\nIn order to run examples you need to install the Python module. Please ensure\nyou built [C and Python API](#c-and-python-api) first. Then execute the\nfollowing command in the top directory of repository:\n```\npython3 -m pip install -e ./python[dev]\n```\n\n\nAfter this one can run unit tests within python directory using pytest:\n```\npytest ./tests\n```\n\n\nOne can run MeTTa script from command line:\n```\nmetta-py ./tests/scripts/