[SUMMARY] Monthly Video Update - February 2021
Transcript of the Tau-Chain & Agoras Monthly Video Update – February 2021
I recently joined the IDNI advisory board as an expert in computational logic. On the foundational side, I have written many papers on computational logic applied to information systems, databases, artificial intelligence, knowledge representation, semantic technologies, natural language semantics, natural language understanding, data integration and query answering. I have also applied many of these techniques in industry: I have run many European projects with industrial partners and consulted for companies of all sizes, from small and medium businesses to large enterprises. In my research I always try to study and understand the real foundations, but also to find out how these foundations can be applied in real scenarios, and vice versa: to understand the real needs of industry and the real problems of users, and then find the logic-based, foundational solutions those problems may have, without compromises. I have worked quite hard in this middle ground, so that I can be understood and appreciated by both practitioners and academics. It is typically a hard job, because academics often do not believe that real-world problems are interesting, and real-world companies often, quite correctly, do not believe they need a foundational solution to a problem they can solve with a quick hack. IDNI in this sense is completely different. In their view, which I completely agree with, the approach to building a platform for the communication and economy of knowledge has to be well founded and logic based, in the sense that whatever decision support and evolution this platform provides should always be guaranteed to be correct and to satisfy truth principles. I find this idea very challenging but also quite focused. The team around Ohad is truly top notch.
There are software engineers, developers and logicians, and the discussions I am having with them are really interesting; they are leading us towards the definition of a better, well-founded TML language. So I believe that not only this research but the application itself has a bright future, and we will try to find it together.
I’ve worked on the interpreter changes which I mentioned at the end of last month. Last month I began work on a version of the interpreter that supports parallelisation; this month I applied the full upgrade across the board. This was quite involved: after updating the schema for quoted programs, I had to update all the directives to support the new schema. The new interpreter uses lower-arity relations, which promised to be faster. Another thing I did was change the design of the interpreter: before, it used slow synchronisation rules, and to make it faster I implemented a fixed-point check inside the interpreter itself. So on the interpreter front we have gained a little performance. Another thing I worked on this month was applying program transformations to quoted programs; this was done successfully, although so far there have not been any speed improvements. I’ve also investigated other semantics for interpreting TML, which may be a little more user friendly yet equivalent to the partial fixed point (PFP) semantics we currently have. This investigation stemmed from my work on the interpreter: the interpreter uses certain rules which cause relations to alternate, and I realised that if we were to use f4pfp or another variation, we would also be able to get the program to reach a fixed point. This is an alternative to other things we are considering as well. Other things I’ve been working on include general testing, documentation, bug fixing and code reviews. To that end I took a look at the proofs for the conjunctive query algorithm to see whether there was a way to shortcut certain operations within the algorithm and so obtain faster performance. That led me to some articles describing faster conjunctive query algorithms.
I’ll be looking into those over the next month, and I hope to start working with binary decision diagrams and the BDD solver for the TML engine.
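The fixed-point check described above can be sketched in miniature. The following is an illustrative Python sketch only (TML’s actual interpreter works over BDD-encoded relations, not Python sets): rules are applied bottom-up until one round derives nothing new, with a guard that detects alternating (repeating) states that never stabilise.

```python
# Minimal sketch of bottom-up evaluation with a fixed-point check.
# Illustrative only; not TML's implementation.

def fixed_point(step, facts):
    """Apply `step` (one round of rule application) until nothing changes."""
    seen = set()                     # guard: detect repeating states
    while True:
        frozen = frozenset(facts)
        if frozen in seen:           # state revisited: alternation, no fixed point
            return None
        seen.add(frozen)
        new = step(facts)
        if new == facts:             # fixed point reached
            return facts
        facts = new

# Example rule set: transitive closure of an edge relation.
edges = {(1, 2), (2, 3), (3, 4)}

def step(facts):
    derived = set(facts)
    for (a, b) in facts:
        for (c, d) in facts:
            if b == c:
                derived.add((a, d))  # edge(a,b), edge(b,d) -> edge(a,d)
    return derived

closure = fixed_point(step, edges)
print(sorted(closure))
```

The `seen` set is what distinguishes “reached a fixed point” from “oscillates forever”, which is exactly the distinction PFP semantics cares about.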
I had the pleasure of implementing the QBF solver. I did that by working through an introductory tutorial on BDDs by a professor called Andersen, which introduces a more or less efficient core implementation of BDDs; with some improvements, I worked out an implementation of a QBF solver based on that. Then I went over this implementation with Ohad and we optimised it further. In the end the results were visible, reducing runtime from five minutes to 47 seconds, so it was quite a pleasure to see the improvement from optimised code. I’ve also spent time studying C++ and database theory in order to really grasp the theory behind TML.
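For readers unfamiliar with QBF, here is a naive recursive evaluator as a baseline, a hypothetical sketch, not the solver described above. A BDD-based solver replaces this exponential branching with quantification over shared decision diagrams, which is where the speed-up comes from. The function names and formula encoding are my own for illustration.

```python
# Naive QBF evaluation: recurse over the quantifier prefix, trying both
# truth values for each variable. Exponential in the number of variables;
# a BDD-based solver quantifies over shared diagrams instead.

def solve_qbf(prefix, matrix, assignment=None):
    """prefix: list of ('forall'|'exists', var); matrix: assignment -> bool."""
    assignment = dict(assignment or {})   # copy so siblings don't interfere
    if not prefix:
        return matrix(assignment)
    quant, var = prefix[0]
    results = []
    for value in (False, True):
        assignment[var] = value
        results.append(solve_qbf(prefix[1:], matrix, assignment))
    return all(results) if quant == 'forall' else any(results)

# forall x exists y: x != y  -- true, since y can always be chosen as not-x
formula = lambda a: a['x'] != a['y']
print(solve_qbf([('forall', 'x'), ('exists', 'y')], formula))  # True
```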
I’ve been working on the design flows of Agoras Live version one and version two, and we finished the design of the home page while adding live sessions. Last week we worked on how to display categories and how the user can navigate the Agoras Live platform. I’ve also been working on designing all the missing mobile-view screens and making flows for how the user can deposit, withdraw or buy Agoras through the platform.
I’ve mostly been testing the Agoras Live platform. I’ve covered every aspect of the platform, with some outstanding work left on the mobile-view side; everything else is pretty much working. I had to refactor the code of the Agoras Live front end, because we are currently looking for a front-end developer to implement more features for Agoras Live and move to the design Mo’az created, with some additional features. We are ready to pass the platform to a professional tester to help me find any outstanding issues I was unable to locate. I’ve also implemented the feature of backing up and restoring encrypted user storage. See here:
This month I’ve been working on aggregating crypto-focused YouTubers, but so far the offers we’ve received were highly overpriced, so we still need to find the right YouTubers to work with. Ideally we are looking to onboard three to five YouTubers for long-term partnerships, so if you have a favourite YouTuber, we are definitely open to recommendations. Together with the team I’ve been on various marketing agency calls, looking for the right partner for our future marketing activities; we’ve come to a conclusion with a partner, and hopefully very soon you’ll feel the impact of that partnership. Fola and I have been on many calls with designers, trying to find the world’s best designers to collaborate with on the website and rebranding. We are still in the process of doing that and hope to make a decision soon. On the marketing front, we’ve also been looking hard for copywriters; ideally we would have an in-house copywriter to communicate the project to the public. I’ve started fine-tuning the presentation and started work on an explainer video. I’ve done a lot of community support related to the ERC-20 swap. It’s an ongoing swap where users on WhiteBit can convert their Omni-based Agoras to ERC-20 Agoras. The swap is automatic: you simply deposit your Omni-based Agoras tokens, and once deposited they are converted to ERC-20 Agoras, which is the only form in which you can withdraw them. Community support also includes the listing we announced on Bittrex, where users who previously held Agoras on the Bittrex exchange will get their Agoras balances restored, including US citizens. US users will be able to withdraw but not trade, due to regulatory reasons. We’ve also announced the Tau Supporter Program, launching on March 15th. We’ve allocated 200,000 Agoras over a three-month period, during which you can accumulate points by completing challenges, thereby climbing the leaderboard and earning your share of the allocation. Apply here:
Tau Supporter Program Info:
The community member of the month is “Miao Miao”, for his long-term support of the project and his active participation on Telegram this month. Thank you!
We are looking to wrap up the micropayment channel, i.e. off-chain payments with Ethereum / ERC-20, in order to support Agoras Live payments. That is at an advanced stage: we already have an off-chain micropayment channel working on Ethereum, but we are currently dealing with impractically high gas fees, which make this setup a poor fit for Agoras Live’s use case. We have therefore started developing Agoras Live payments on the Binance Smart Chain, where all the code we have put together for Ethereum is fully compatible. We still need to figure out several infrastructural and integration tweaks to make it functional. This decreases the fees from $20-$40 on Ethereum to 20-40 cents on Binance Smart Chain. Next month I hope to start on TML’s second-order logic support, based on Ohad’s algorithm.
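The gas-fee point above comes from how payment channels work: only opening and settling a channel touch the chain, while each individual payment is just a newly signed balance state exchanged off-chain. Here is a minimal sketch of a unidirectional channel under simplifying assumptions of my own; signatures are simulated with HMAC, whereas a real channel uses ECDSA signatures verified by an on-chain contract.

```python
# Sketch of a unidirectional off-chain payment channel (illustrative only).
# Real channels use ECDSA keys and an on-chain settlement contract;
# here HMAC stands in for signing, and `settle` stands in for the contract.
import hashlib
import hmac

SECRET = b"payer-private-key"        # stand-in for a real signing key

def sign_state(nonce, payee_balance):
    msg = f"{nonce}:{payee_balance}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

class Channel:
    def __init__(self, deposit):
        self.deposit = deposit       # locked on-chain when the channel opens
        self.nonce = 0
        self.payee_balance = 0
        self.latest_sig = sign_state(0, 0)

    def pay(self, amount):
        # Off-chain: no gas fee, just a fresh signed state with a higher nonce.
        assert self.payee_balance + amount <= self.deposit
        self.nonce += 1
        self.payee_balance += amount
        self.latest_sig = sign_state(self.nonce, self.payee_balance)

    def settle(self):
        # On-chain: the contract verifies the latest signed state and splits funds.
        assert hmac.compare_digest(
            self.latest_sig, sign_state(self.nonce, self.payee_balance))
        return self.payee_balance, self.deposit - self.payee_balance

ch = Channel(deposit=100)
ch.pay(5)
ch.pay(10)
print(ch.settle())                   # (15, 85)
```

Since `pay` never touches the chain, per-payment gas cost is zero regardless of which chain the open/settle transactions run on, which is why moving settlement to a cheaper chain changes the economics so dramatically.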
Fola: I’ve been involved in the hiring discussions that have happened so far this month: a web apps developer, a front-end developer, a blockchain developer and a copywriter, and we’ve also settled on a marketing firm we’re talking to now. There is a lot going on at IDNI now, and we hope to settle on a graphic designer soon and finish off the website. Bittrex have swapped the Omni tokens to ERC-20 tokens, and we hope to have a listing date from them soon. Please don’t send any Omni tokens to them just yet; we are currently negotiating how to do an ongoing swap with them. Once I get more details, we’ll update the article. We are also working on the UKex swap, and we hope to have that finished pretty soon, wrapped up next week, including a listing date.
I have been working on the swap with Bittrex; getting back there was a long-standing goal that Fola set out to achieve, and he has achieved it successfully. I have worked on algorithms for second-order logic which can serve as an initial algorithm, so finally we will have a solver for second-order logic. It has a lot of room for improvement, which is very good news: it has the potential to improve, there are already some ways we can see to improve it, and it will improve with time. In the discussions with the academic panel, we have discussed lifting the opinion map to a more fundamental role in the system, and by that bypassing a certain logical difficulty in the setting of laws of changing the laws, and so on. I have also been thinking about more ideas for optimising TML’s speed: not only faster, but faster sooner. One way to go is to support more backends. Right now we have the BDD backend that we implemented, but perhaps we can allow the user to choose other backends like Souffle or SQLite, etc., for certain tasks that BDDs are not good at, at least not yet (optimising BDDs is an ongoing effort which contains many non-trivial tasks). By allowing other existing backends, we may be able to make TML faster, namely sooner, and hopefully release the bottleneck of TML not being fast enough, which stops us from achieving certain goals. I have also been thinking about how to start working on the mainnet, and how to have a blockchain implemented in TML. We will hire personnel for this matter. There is a lot to decide there, but the process has been started.
Q&A:
Q: Lucca, what ideas are you working on that you think could improve Tau and/or Agoras? What do you hope to achieve?
Lucca: The long-term goal is to work on model finding for second-order logic formulas, or even higher-order formulas. This is particularly important for Tau, because the later vision is that you just write a specification of what the program should do, and then you want to extract a program from that specification. This is nothing other than model finding in a higher-order domain, because a program formed from a specification can be seen, so to speak, as a model of that specification. So this is one point, let’s say a necessity, for the Tau environment: to have higher-order model finding. Another thing is that with higher-order model finding, you can pose queries to databases that are higher order in a logical sense, i.e. that use higher-order quantifiers. As mentioned, Ohad came up with an algorithm that does the job for second-order logic, so one of the questions is whether we can do it more efficiently, or whether there are other ways; let’s see over the next few months how these thoughts evolve. I will dive into the theory and possibly make this even better. I also come from the background of algorithmic model theory, where one of the tasks is to find a model for a given formula or set of formulas; when these are higher-order logic formulas, you want to be able to generate a model automatically. Let’s say I want a group of order 20, for example; current model-finding technology is not able to generate this. This is not specific to groups but holds in the general sense, so one of the goals is also to examine whether we can push model finding, in a competitive setting, to a new level.
Q: When will the token swap be available for token holders in the United States?
Fola: Bittrex have swapped the Omni tokens for ERC-20 tokens. As a US citizen you can’t trade, due to regulatory reasons, but you can withdraw. We don’t know when the swap is coming for other US token holders; the best we can do is keep working on it.
Q: Since performance and scaling are going to be a challenge for Tau and Agoras, what are some ideas you have for improving performance, and how much performance is enough to be useful?
Ohad: We have a long list of optimisations for the BDD layer, and most, if not all, of those optimisations are non-trivial, otherwise we’d have done them beforehand; that’s an ongoing effort. We can also support backends for TML other than BDDs, and by that release the performance bottlenecks, and release sooner. How do we know if it’s fast enough? Well, it should be fast enough to fulfil its goals in a reasonable time. Its goal is to serve as a language translator, so it needs to be able to translate documents in a time reasonable to the user. For now it can’t parse itself in a reasonable time; when we see progress on this front, we can say that TML is fast enough to continue working with it.
Q: How will IP (intellectual property) be handled over something like Agoras? e.g. will Agoras rank who contributed the knowledge first and reward accordingly to this, or are there some better ideas?
Ohad: This is up to the users, and I don’t mean in the sense that the users define the system; I mean that whether to trust a certain knowledge provider is up to you, just like in everyday life. Tau cannot say who is right and who is wrong, who has good and who has bad knowledge. It is up to the users, by traditional means.
Q: Will Agoras Live be ported to TML? What are the advantages and disadvantages of doing this?
Ohad: It will be ported to TML in the future. It will use the smart contract mechanism that comes with Tau. I don’t see any apparent disadvantage. Advantages, the ability to have formalised contracts mainly and also the fact that it is controlled by its users so they can control how the platform behaves.
Q: In layman’s terms, what does “size of universe” mean in the Tau context, and what is its relevance?
Ohad: The universe, in logical terms, also called the domain of discourse, is basically the vocabulary that the TML program works over. The size of the vocabulary determines the number of bits you need to encode each term in the vocabulary, and that number of bits is then reflected in the binary decision diagrams: the higher the number of bits, the larger, potentially exponentially larger, the BDDs can get. Therefore it is very important to keep the number of bits as small as possible. Also, from complexity-theory considerations, the time it takes for a TML program to run is a function of the size of the universe, which is yet another reason why we need to be very strict about its size.
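The bit-count relationship described above is just binary encoding: a universe of n symbols needs ceil(log2(n)) bits per term, and since BDD size can grow exponentially in the number of bit-variables, a small universe keeps the diagrams manageable. A tiny illustration (the function name is mine):

```python
# How many bits are needed to encode each term of a universe of size n.
import math

def bits_needed(universe_size):
    """ceil(log2(n)) bits distinguish n symbols; at least 1 bit for n >= 1."""
    return max(1, math.ceil(math.log2(universe_size)))

for n in (2, 100, 1000, 1_000_000):
    print(n, bits_needed(n))   # 2->1, 100->7, 1000->10, 1000000->20
```

Going from 1,000 symbols to 1,000,000 only doubles the bit count, but each extra bit potentially doubles the worst-case BDD size, which is why trimming even a few bits from the encoding matters.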
Q: In what cases will computation be done directly in the Tau runtime? In what cases will computation be done in other binaries synthesised and compiled by Tau? Will it make sense to synthesise and compile just-in-time binaries per purpose? Is this how Tauchain will update blockchain node binaries?
Ohad: It makes sense to compile code to binaries. From the point of view of the network, it doesn’t matter how a node computes whatever it wants to send over the network, whether by interpretation or compilation. But the code of Tau itself, as distributed over the network, should not be distributed as binaries; it should be distributed as logical formulas that users can reason over. That’s the main advantage of the Tau technology.
Q: What will be the first languages defined for Tau?
Fola: The first language over Tau will be TML, and it’s entirely up to the users to write translators to other languages, so that other users can begin to use those languages.
Ohad: It is first and foremost for knowledge representation languages. TML is a logical language, but it is not suitable for knowledge representation, so the intention is to use TML to write translators for knowledge representation languages that admit the logical requirements of Tau.
Q: Languages have different scopes and expressibility. How can they be translated, or can they? For example, can controlled English be translated to Python? C++ to Rust? Is this problem for Tau or only a problem for humans?
Ohad: Yes, it is a problem for humans. Humans will have to define the translation process; Tau cannot just look at a language and guess how to translate it.
Q: What do you see as the progression of use-cases as Tau matures? E.g. will the first application of Tau be to optimise Tau, then to define a language, then to synthesise other programs? Or define C++, define controlled English, discuss Tau and bootstrap Tauchain? Or is it still unclear and too early to tell?
Ohad: The progression of the use cases will be to have large-scale discussions in general, then, in particular, discussions of what Tau’s next version should be like. Then the system will automatically update itself, and on top of that comes the economy of knowledge: the ability to trade knowledge.
Q: Does the team foresee Tau-specific hardware for faster and or embedded processing?
Ohad: Not for now, but maybe we will see in the future.
Q: If Agoras Live is being built on top of the mainnet, being the first application using Tauchain, will its release need to be postponed until the mainnet is ready?
Ohad: No, we won’t postpone the release of Agoras Live, as it is developed separately from the mainnet and we are not building it on TML. In the future we will integrate it into the whole Tau technology.
Q: What kind of improvements or new features can we expect if Agoras Live is built on top of the mainnet instead of being an independent application?
Ohad: Users will be able to control how the program behaves and change it. They will have the ability to have richer contracts and contracts that can be reasoned over.
Q: What is the fundamental reason we still craft code by hand and not by program synthesis? Does tau solve those fundamental problems?
Ohad: The main reason is the lack of tools, and the lack of good tools, especially from a performance point of view. And yes, this is a major problem that Tau intends to solve.