diff --git a/docs-sphinx/index.rst b/docs-sphinx/index.rst
index 19d912b311..4ac603fa13 100644
--- a/docs-sphinx/index.rst
+++ b/docs-sphinx/index.rst
@@ -24,6 +24,7 @@ Welcome to AElf's official documentation!
   Running a side chain
   Running AElf on the cloud
   Smart Contract Developing Demos
+  Building an indexer plugin <tutorials/indexer/index>
 
 .. toctree::
diff --git a/docs-sphinx/tutorials/indexer/AeFinder-introduction.rst b/docs-sphinx/tutorials/indexer/AeFinder-introduction.rst
new file mode 100644
index 0000000000..ae4998a56e
--- /dev/null
+++ b/docs-sphinx/tutorials/indexer/AeFinder-introduction.rst
@@ -0,0 +1,84 @@
+Introduction
+============
+
+"Indexing" refers to the process of synchronizing block data from AElf blockchain
+nodes to a local, centralized environment for storage.
+
+The indexing system then provides various data interfaces. Whether you are a dApp
+developer looking to build applications on the AElf blockchain or simply curious
+about how the AElf node scanning system operates, this document is for you.
+
+Overall Workflow
+----------------
+
+The overall workflow of the indexer runs from the AElf nodes, which push block
+data, all the way to the DApp, which receives the desired on-chain data.
+
+.. image:: indexer-overall.png
+   :alt: Overall Workflow
+
+1. AElf Node Push
+~~~~~~~~~~~~~~~~~
+
+AeFinder, the indexer of the AElf ecosystem, extends AElf nodes so that historical
+or latest block information is automatically transmitted, asynchronously or
+synchronously, to the RabbitMQ message queue. AeFinder's storage module then
+receives and processes the relevant block data.
+
+2. Indexer Storage
+~~~~~~~~~~~~~~~~~~
+
+Upon receiving block data from RabbitMQ, the Indexer storage module recognizes and
+processes the data, detecting any forked blocks. During this process, some auxiliary
+data is stored in MongoDB, but ultimately all block data (excluding forks) is stored
+in Elasticsearch. The data is organized into different indices based on the structures
+of Block, Transaction, and LogEvent.
+
+3. Indexer Subscription and Push
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+**Subscription:**
+Consumers of AeFinder can initiate subscriptions for block-related information
+through the subscription API. Subscriptions currently support three dimensions:
+block height, block transactions, and block transaction events. Subscribing by
+transaction event is especially common and applies to a wide range of scenarios.
+After making a subscription API request with a client ID, a subscription version is
+returned; note it down, as it is later written into the client interface plugin you
+develop.
+
+**Push:**
+Upon receiving a subscription request, the AeFinder subscription and push module
+fetches data from Elasticsearch based on the subscription requirements and streams
+the data to the Kafka message queue.
+
+4. Indexer Client
+~~~~~~~~~~~~~~~~~
+
+The Indexer client receives subscribed block data from Kafka and passes it to the
+interface plugin for processing. Interface plugins are developed separately and
+handle the various transactions and events within blocks, storing the processed result
+set in a specific Elasticsearch index. Based on requirements, GraphQL interfaces are
+defined that use Elasticsearch as the data source to implement business logic and expose
+the data externally.
+
+5. 
DApp Integration +~~~~~~~~~~~~~~~~~~~~ + +Within the DApp, desired data can be requested by directly calling the GraphQL interface +exposed by the AeFinder client, based on the client ID. + +Why Indexer Is Needed +---------------------- + +The role of the Indexer in the AElf blockchain is crucial. It synchronizes block +information from AElf nodes to a local ElasticSearch environment, providing developers +with convenient and efficient data access interfaces. By enabling real-time or historical +data asynchronous or synchronous transmission, the Indexer enhances the functionality of +AElf nodes, allowing them to handle block information more flexibly. Moreover, it offers +robust data querying and subscription mechanisms. This mechanism enables decentralized +application (DApp) developers to easily subscribe to and retrieve relevant information +about specific blocks, transactions, or events, facilitating the development of applications +on the AElf blockchain. With the Indexer, developers can establish indexes, query, and analyze +blockchain data more effortlessly, improving DApp development efficiency and providing +convenience for broader blockchain data utilization. + diff --git a/docs-sphinx/tutorials/indexer/build-an-AeFinder-plugin.rst b/docs-sphinx/tutorials/indexer/build-an-AeFinder-plugin.rst new file mode 100644 index 0000000000..b221865d9f --- /dev/null +++ b/docs-sphinx/tutorials/indexer/build-an-AeFinder-plugin.rst @@ -0,0 +1,696 @@ +Build Indexer +============= + +Step 1. Subscribe block information +------------------------------------ + +**Obtaining Authentication Authorization** + +The demand side (DApp) needs to contact the indexer system administrator first to get the client ID and key assigned by the indexer, which looks like this: + +.. code-block:: json + + { + "ClientId": "Sample_DApp", + "ClientSecret": "1q2w3e*" + } + +Each DApp that requires an indexer should apply for a corresponding client ID and key, which will be valid for a long time. + +Upon obtaining the client ID and key pre-allocated by the AeFinder, you can initiate an authentication authorization request to the Indexer, obtaining an authentication token (Token) upon successful verification. + +**Post request address** : ``http://URL:{Port}/connect/token`` + +The URL and port correspond to the server address where the AeFinder AuthServer service is located. Please contact AElf to get it. + +**Request Body** (x-www-form-urlencoded): + +.. code-block:: text + + grant_type:client_credentials + scope:AElfIndexer + client_id:Sample_DApp + client_secret:1q2w3e* + +**Response**: + +.. code-block:: json + + { + "access_token": "eyJhbGciOiJSUzI1NiIsImtpZCI6IkY1RDFFRjAzRDlEMEU2MTI1N0ZFMTc0ODVBRkI2RjUzNDc0QzJEQjkiLCJ4NXQiOiI5ZEh2QTluUTVoSlhfaGRJV3Z0dlUwZE1MYmsiLCJ0eXAiOiJhdCtqd3QifQ.eyJvaV9wcnN0IjoiQUVsZkluZGV4ZXJfREFwcCIsImNsaWVudF9pZCI6IkFFbGZJbmRleGVyX0RBcHAiLCJvaV90a25faWQiOiI5MTljZmYzOC0xNWNhLTJkYWUtMzljYi0zYTA4YzdhZjMxYzkiLCJhdWQiOiJBRWxmSW5kZXhlciIsInNjb3BlIjoiQUVsZkluZGV4ZXIiLCJleHAiOjE2NzM3OTEwOTYsImlzcyI6Imh0dHA6Ly9sb2NhbGhvc3Q6ODA4My8iLCJpYXQiOjE2NzM3ODc0OTZ9.aABo_opBCiC3wePnIJpc6y3E4-nj50_WP93cYoYwxRGOxnXIq6LXz_r3-V_rmbzbxL3TbQvWQVuCcslF_rUJTMo6e6WC1ji5Ec9DtPpGbOOOvYALNhgOiP9p9TbzVubxHg7WdT6OEDLFihh4hsxtVBTK5_z8YXTa7fktLqve5Bd2eOpjb1TnQC7yZMwUvhnvQrjxuK9uRNxe9ODDt2EIcRhIQW5dQ-SDXpVoNfypY0GxQpuyHjwoJbtScJaX4HfHbh0Fis8EINOwpJr3-GKtcS6F4-t4FyOWMVW19y1_JAoCKTUlNy__htpdMOMQ-5nmFYYzlNr27LSOC_cylXz4lw", + "token_type": "Bearer", + "expires_in": 3593 + } + + +The access_token is the authentication token. 
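+For illustration only, the token request can be issued from any HTTP client. The
+following minimal C# sketch posts the form fields shown above; the host, port, and
+credentials are placeholders taken from the examples in this section, not real values.
+
+.. code:: c#
+
+    using System.Text.Json;
+
+    // Minimal sketch: request an AeFinder access token via the client_credentials flow.
+    var http = new HttpClient();
+    var form = new FormUrlEncodedContent(new Dictionary<string, string>
+    {
+        ["grant_type"] = "client_credentials",
+        ["scope"] = "AElfIndexer",
+        ["client_id"] = "Sample_DApp",
+        ["client_secret"] = "1q2w3e*"
+    });
+    // Replace URL and Port with the AeFinder AuthServer address provided by AElf.
+    var response = await http.PostAsync("http://URL:Port/connect/token", form);
+    response.EnsureSuccessStatusCode();
+    using var payload = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
+    var accessToken = payload.RootElement.GetProperty("access_token").GetString();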
+This access token is required when making subscription requests to the Subscription API.
+
+**Send Subscription**
+
+By sending a request to the Subscription API, you inform the Indexer system that your DApp needs to subscribe to specific blocks, transactions, or events. Subsequently, when the interface plugin connects to AeFinder, the Indexer system filters out the specified blocks/transactions/events and pushes them to the corresponding interface plugin.
+
+**Post request address** : ``http://URL:{Port}/api/app/subscription``
+
+Request Mode: raw
+
+Request Header: Authorization Bearer {access_token}
+
+**Request Body**:
+
+.. code-block:: json
+
+    [
+      {
+        "chainId": "tDVV",
+        "startBlockNumber": 48532699,
+        "onlyConfirmedBlock": false,
+        "filterType": "Transaction",
+        "subscribeEvents": [
+          {
+            "contractAddress": "{contract_address}", // the address of the contract you deployed
+            "eventNames": [
+              "SampleEvent" // the event defined in the contract that you want to index
+            ]
+          }
+        ]
+      }
+    ]
+
+Parameters Explanation:
+
++----------------------+----------------------------------------------------------------+
+| Parameter            | Description                                                    |
++======================+================================================================+
+| ChainId              | The AElf chain ID to subscribe to, e.g., AElf for the          |
+|                      | main chain and tDVV for the side chain.                        |
++----------------------+----------------------------------------------------------------+
+| StartBlockNumber     | The initial block height for the subscription push, usually    |
+|                      | the block height at which the contract was deployed.           |
++----------------------+----------------------------------------------------------------+
+| OnlyConfirmedBlock   | Whether to subscribe to confirmed blocks only.                 |
++----------------------+----------------------------------------------------------------+
+| FilterType           | The type of block data to be subscribed. Currently, the        |
+|                      | indexer system categorizes complete block data into three      |
+|                      | levels of data structures: Block, Transaction, and LogEvent.   |
+|                      | For details, refer to the Scanning Data Structure Example.     |
++----------------------+----------------------------------------------------------------+
+| SubscribeEvents      | The subscribed events.                                         |
++----------------------+----------------------------------------------------------------+
+
+After the API call succeeds, the subscription version is returned, e.g. ``932e5a54b6044e049cf939607b248d89``
+
+Note down this version number, as it will be used in the development of the client interface plugin in Step 2.
+
+**Get Existing Subscription**
+
+If you need to view all the subscriptions you have initiated, you can query them through the following API.
+
+**Get request address** : ``http://URL:{Port}/api/app/subscription``
+
+Request Header: Authorization Bearer {access_token}
+
+**Response**:
+
+.. code-block:: json
+
+    {
+      "currentVersion": {
+        "version": "932e5a54b6044e049cf939607b248d89",
+        "subscriptionInfos": [
+          {
+            "chainId": "tDVV",
+            "startBlockNumber": 48532699,
+            "onlyConfirmedBlock": false,
+            "filterType": 1,
+            "subscribeEvents": [
+              {
+                "contractAddress": "{contract_address}",
+                "eventNames": [
+                  "SampleEvent"
+                ]
+              }
+            ]
+          }
+        ]
+      },
+      "newVersion": null
+    }
+
+**Stop Running Subscription**
+
+**Post request address** : ``http://URL:{port}/api/app/block-scan/stop?version={subscription_version}``
+
+This API is used to stop a running subscription.
+
+Request Header: Authorization Bearer {access_token}
+
+**Replace Running Subscription by New Subscription**
+
+**Post request address** : ``http://URL:{port}/api/app/block-scan/upgrade``
+
+This API replaces the current subscription version with a new one. After a new subscription is created,
+it is stored as "newVersion". When it is ready to use, call this API to promote it to "currentVersion".
+
+.. image:: subscription_version.jpeg
+   :alt: Subscription Version
+
+Request Header: Authorization Bearer {access_token}
+
+**Update Running Subscription**
+
+**Post request address** : ``http://URL:{Port}/api/app/subscription/{Version}``
+
+Request Mode: raw
+
+Request Header: Authorization Bearer {access_token}
+
+**Request Body**:
+
+.. code-block:: json
+
+    [
+      {
+        "chainId": "AELF",
+        "startBlockNumber": 54541,
+        "onlyConfirmedBlock": false,
+        "filterType": "LogEvent",
+        "subscribeEvents": [
+          {
+            // update content
+          }
+        ]
+      }
+    ]
+
+Step 2. Indexer Plugin Development
+------------------------------------
+
+Having understood how AeFinder works, you will find that to enable a DApp to request
+data from AeFinder, the main task is to develop a client interface plugin.
+
+.. image:: indexer-plugin.png
+   :alt: Indexer Plugin
+
+The following uses a sample project to explain in detail how to develop a client interface plugin.
+
+A sample indexer project repo: ``https://github.com/xibo1/aelf-indexer-demo/tree/dev``
+
+A completed indexer project repo: ``https://github.com/Portkey-Wallet/bingo-game-indexer``
+
+**Development Environment**
+
+.NET 7.0
+
+**Building the Project Skeleton**
+
+1. Build an empty .NET 7.0 project
+
+2. Create two main folders: ``src`` and ``test``
+
+The ``src`` folder will contain the code of the indexer plugin; the ``test`` folder will contain the unit tests for the indexer plugin.
+
+3. Add the required package
+
+Under the ``src`` folder, create the project file (e.g. ``Sample.Indexer.csproj``) and import the AElfIndexer.Client package. The latest version of this package is "1.0.0-28".
+
+Here is the sample code of it:
+
+.. code:: xml
+
+    <Project Sdk="Microsoft.NET.Sdk">
+
+        <PropertyGroup>
+            <TargetFramework>net7.0</TargetFramework>
+            <ImplicitUsings>enable</ImplicitUsings>
+        </PropertyGroup>
+
+        <ItemGroup>
+            <PackageReference Include="AElfIndexer.Client" Version="1.0.0-28" />
+        </ItemGroup>
+
+    </Project>
+
+4. Build the src skeleton
+
+Under the ``src`` folder, create these subfolders: Contract, Entities, GraphQL, Handler and Processors.
+These folders will contain the different parts of the indexer plugin.
+
+Contract: This folder will contain the generated files of your contract, which end with ``.c.cs`` and ``.g.cs``, e.g. HelloWorldContract.c.cs and HelloWorldContract.g.cs. They tell the indexer plugin about the event data structures defined in the contract.
+
+Entities: This folder will contain the files defining the data structures used for storing and querying data.
+
+GraphQL: This folder will contain the files defining the interface for querying data from storage and the data structures used by the GraphQL interface.
+
+Handler: This folder will contain handlers that define how block data is handled.
+
+Processors: This folder will contain processors, i.e. the specific logic for processing the indexed data and storing it into storage.
+
+5. Add contract files to the project
+
+Move the generated contract files to the ``src/Contract`` folder. These generated contract files end with ``.c.cs`` and ``.g.cs``. They can be found under the path ``/Protobuf/Generated``.
+6. Define data structures
+
+After the interface plugin receives the corresponding block data from the AeFinder client, it needs to process the block data of each height according to your custom code logic. The processed results should be updated and stored in the index library. In general, behind each interface there is a corresponding index library that stores its result set.
+
+Currently, the AeFinder system supports Elasticsearch as the medium for persistent storage of index libraries. The entity class describing the index library structure of the result set needs to be defined manually, inheriting from AElfIndexerClientEntity and implementing the IIndexBuild interface.
+
+This entity is the data structure used when storing information into Elasticsearch after processing the data obtained through AeFinder.
+
+Create a file IndexEntry.cs under the src/Entities folder. Here is the sample code of it:
+
+.. code:: c#
+
+    using AElf.Indexing.Elasticsearch;
+    using AElfIndexer.Client;
+    using Nest;
+
+    namespace Sample.Indexer.Entities
+    {
+        public class SampleIndexEntry : AElfIndexerClientEntity<string>, IIndexBuild
+        {
+            // Define the fields according to your own usage requirements.
+            [Keyword]
+            public string FromAddress { get; set; }
+
+            public long Timestamp { get; set; }
+
+            public long Amount { get; set; }
+        }
+    }
+
+7. Creating the GraphQL query interface
+
+This interface serves as the user's entry point for querying data. It contains the logic that GraphQL uses to return data to the user when a query is made.
+
+Create a file Query.cs under src/GraphQL. Here is the sample code of it:
+
+.. code:: c#
+
+    using AElfIndexer.Client;
+    using AElfIndexer.Grains.State.Client;
+    using GraphQL;
+    using Nest;
+    using Sample.Indexer.Entities;
+    using Volo.Abp.ObjectMapping;
+
+    namespace Sample.Indexer.GraphQL
+    {
+        public class Query
+        {
+            public static async Task<SampleResultDto> SampleIndexerQuery(
+                [FromServices] IAElfIndexerClientEntityRepository<SampleIndexEntry, TransactionInfo> repository,
+                [FromServices] IObjectMapper objectMapper, QueryDto dto)
+            {
+                // Adjust the query logic according to your own usage requirements.
+                var infoQuery = new List<Func<QueryContainerDescriptor<SampleIndexEntry>, QueryContainer>>();
+                if (dto.PlayerAddress == null)
+                {
+                    return new SampleResultDto();
+                }
+                infoQuery.Add(q => q.Terms(i => i.Field(f => f.FromAddress).Terms(dto.PlayerAddress)));
+                var result = await repository.GetSortListAsync(
+                    f => f.Bool(b => b.Must(infoQuery)),
+                    sortFunc: s => s.Descending(a => a.Timestamp));
+                var dataList = objectMapper.Map<List<SampleIndexEntry>, List<TransactionData>>(result.Item2);
+                var queryResult = new SampleResultDto
+                {
+                    Data = dataList
+                };
+                return queryResult;
+            }
+        }
+    }
+
+8. Create the GraphQL structure class
+
+Create a file IndexerSchema.cs under src/GraphQL. Here is the sample code of it:
+
+.. code:: c#
+
+    using AElfIndexer.Client.GraphQL;
+
+    namespace Sample.Indexer.GraphQL
+    {
+        public class IndexerSchema : AElfIndexerClientSchema<Query>
+        {
+            public IndexerSchema(IServiceProvider serviceProvider) : base(serviceProvider)
+            {
+            }
+        }
+    }
+
+9. Define data structures for Query
+
+Besides the schema and query logic, the data structures used in Query also need to be defined. At least two data structures
+are needed: QueryDto, the input for querying data, and SampleResultDto, the output.
+Create a file Dto.cs under src/GraphQL. Here is the sample code of it:
+.. code:: c#
+
+    using GraphQL;
+    using Volo.Abp.Application.Dtos;
+
+    namespace Sample.Indexer.GraphQL
+    {
+        public class QueryDto : PagedResultRequestDto
+        {
+            [Name("playerAddress")]
+            public string PlayerAddress { get; set; }
+        }
+
+        public class SampleResultDto
+        {
+            public List<TransactionData> Data { get; set; }
+        }
+
+        public class TransactionData
+        {
+            public string FromAddress { get; set; }
+
+            public long Timestamp { get; set; }
+
+            public long Amount { get; set; }
+        }
+    }
+
+10. Build processors
+
+Depending on the subscribed block information type (Block/Transaction/LogEvent), the processing methods
+for each may vary slightly.
+
+Transaction
+
+Processing block transaction data of the Transaction structure type mainly involves handling TransactionInfo.
+To do this, you need to inherit from the ``AElfLogEventProcessorBase`` class, and override and implement its
+``GetContractAddress`` and ``HandleEventAsync`` methods.
+
+.. code:: c#
+
+    public abstract class SampleTransactionProcessor : AElfLogEventProcessorBase<SampleEvent, TransactionInfo>
+    {
+        // SampleTransactionIndex is a second entity that you would define under src/Entities.
+        protected readonly IAElfIndexerClientEntityRepository<SampleTransactionIndex, TransactionInfo> SampleTransactionIndexRepository;
+        protected readonly IAElfIndexerClientEntityRepository<SampleIndexEntry, TransactionInfo> SampleIndexRepository;
+        protected readonly ContractInfoOptions ContractInfoOptions;
+        protected readonly IObjectMapper ObjectMapper;
+
+        protected SampleTransactionProcessor(ILogger<SampleTransactionProcessor> logger,
+            IAElfIndexerClientEntityRepository<SampleIndexEntry, TransactionInfo> sampleIndexRepository,
+            IAElfIndexerClientEntityRepository<SampleTransactionIndex, TransactionInfo> sampleTransactionIndexRepository,
+            IOptionsSnapshot<ContractInfoOptions> contractInfoOptions,
+            IObjectMapper objectMapper) : base(logger)
+        {
+            SampleTransactionIndexRepository = sampleTransactionIndexRepository;
+            SampleIndexRepository = sampleIndexRepository;
+            ContractInfoOptions = contractInfoOptions.Value;
+            ObjectMapper = objectMapper;
+        }
+
+        public override string GetContractAddress(string chainId)
+        {
+            return ContractInfoOptions.ContractInfos.First(c => c.ChainId == chainId).SampleContractAddress;
+        }
+
+        protected override async Task HandleEventAsync(SampleEvent eventValue, LogEventContext context)
+        {
+            // implement your handling logic here
+        }
+    }
+
+LogEvent
+
+Processing block transaction data of the LogEvent structure type primarily involves handling LogEventInfo.
+To do this, you need to inherit from the ``AElfLogEventProcessorBase`` class, and override and implement its
+``GetContractAddress`` and ``HandleEventAsync`` methods.
+
+.. code:: c#
+
+    public class SampleLogEventProcessor : AElfLogEventProcessorBase<SampleEvent, LogEventInfo>
+    {
+        private readonly IAElfIndexerClientEntityRepository<SampleIndexEntry, LogEventInfo> _repository;
+        private readonly ContractInfoOptions _contractInfoOptions;
+        private readonly IObjectMapper _objectMapper;
+
+        public SampleLogEventProcessor(ILogger<SampleLogEventProcessor> logger, IObjectMapper objectMapper,
+            IAElfIndexerClientEntityRepository<SampleIndexEntry, LogEventInfo> repository,
+            IOptionsSnapshot<ContractInfoOptions> contractInfoOptions) : base(logger)
+        {
+            _objectMapper = objectMapper;
+            _repository = repository;
+            _contractInfoOptions = contractInfoOptions.Value;
+        }
+
+        public override string GetContractAddress(string chainId)
+        {
+            return _contractInfoOptions.ContractInfos.First(c => c.ChainId == chainId).SampleContractAddress;
+        }
+
+        protected override async Task HandleEventAsync(SampleEvent eventValue, LogEventContext context)
+        {
+            // implement your handling logic here
+        }
+    }
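+The processors above read the contract address from a ``ContractInfoOptions`` options class that is bound to the
+``ContractInfo`` configuration section (see the module registration in step 11). This class is not shown elsewhere
+in this guide, so the sketch below is an assumed minimal definition consistent with how it is used; adjust the
+property names to your own configuration.
+
+.. code:: c#
+
+    namespace Sample.Indexer
+    {
+        // Assumed options class bound from the "ContractInfo" configuration section.
+        // Each entry maps a chain ID to the address of the contract whose events this plugin processes.
+        public class ContractInfoOptions
+        {
+            public List<ContractInfo> ContractInfos { get; set; }
+        }
+
+        public class ContractInfo
+        {
+            public string ChainId { get; set; }
+
+            public string SampleContractAddress { get; set; }
+        }
+    }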
+Block
+
+Processing block data of the Block structure type mainly involves handling BlockInfo. To do this, you need to inherit
+from the ``BlockDataHandler`` class and override and implement its ``ProcessDataAsync`` method.
+
+.. code:: c#
+
+    public class SampleBlockProcessor : BlockDataHandler
+    {
+        // SampleBlockIndex is a block-level entity that you would define under src/Entities.
+        private readonly IAElfIndexerClientEntityRepository<SampleBlockIndex, BlockInfo> _repository;
+
+        public SampleBlockProcessor(IClusterClient clusterClient, IObjectMapper objectMapper,
+            IAElfIndexerClientInfoProvider aelfIndexerClientInfoProvider,
+            IAElfIndexerClientEntityRepository<SampleBlockIndex, BlockInfo> repository,
+            ILogger<SampleBlockProcessor> logger) : base(clusterClient, objectMapper, aelfIndexerClientInfoProvider, logger)
+        {
+            _repository = repository;
+        }
+
+        protected override async Task ProcessDataAsync(List<BlockInfo> data)
+        {
+            foreach (var block in data)
+            {
+                var index = ObjectMapper.Map<BlockInfo, SampleBlockIndex>(block);
+                Logger.LogDebug(index.ToJsonString());
+                await _repository.AddOrUpdateAsync(index);
+            }
+        }
+
+        protected override Task ProcessBlocksAsync(List<BlockInfo> data)
+        {
+            // implement your handling logic here
+            return Task.CompletedTask;
+        }
+    }
+
+Create the processor file, e.g. ``SampleTransactionProcessor.cs``, under the ``src/Processors`` folder. Here is the sample code:
+
+.. code:: c#
+
+    using AElfIndexer.Client;
+    using AElfIndexer.Client.Handlers;
+    using AElfIndexer.Grains.State.Client;
+    using Microsoft.Extensions.Logging;
+    using Microsoft.Extensions.Options;
+    using Sample.Indexer.Entities;
+    using AElf.Contracts.HelloWorld;
+    using IObjectMapper = Volo.Abp.ObjectMapping.IObjectMapper;
+
+    namespace Sample.Indexer.Processors
+    {
+        public class SampleTransactionProcessor : AElfLogEventProcessorBase<SampleEvent, TransactionInfo>
+        {
+            private readonly IAElfIndexerClientEntityRepository<SampleIndexEntry, TransactionInfo> _sampleIndexRepository;
+            private readonly ContractInfoOptions _contractInfoOptions;
+            private readonly IObjectMapper _objectMapper;
+
+            public SampleTransactionProcessor(ILogger<SampleTransactionProcessor> logger,
+                IAElfIndexerClientEntityRepository<SampleIndexEntry, TransactionInfo> sampleIndexRepository,
+                IOptionsSnapshot<ContractInfoOptions> contractInfoOptions,
+                IObjectMapper objectMapper) : base(logger)
+            {
+                _sampleIndexRepository = sampleIndexRepository;
+                _objectMapper = objectMapper;
+                _contractInfoOptions = contractInfoOptions.Value;
+            }
+
+            public override string GetContractAddress(string chainId)
+            {
+                return _contractInfoOptions.ContractInfos.First(c => c.ChainId == chainId).SampleContractAddress;
+            }
+
+            protected override async Task HandleEventAsync(SampleEvent eventValue, LogEventContext context)
+            {
+                if (eventValue.PlayerAddress == null)
+                {
+                    return;
+                }
+
+                var indexEntry = new SampleIndexEntry
+                {
+                    Id = eventValue.PlayerAddress,
+                    FromAddress = eventValue.PlayerAddress,
+                    Timestamp = eventValue.Timestamp,
+                    Amount = eventValue.Amount
+                };
+                _objectMapper.Map(context, indexEntry);
+                await _sampleIndexRepository.AddOrUpdateAsync(indexEntry);
+            }
+        }
+    }
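+The processor above relies on ``IObjectMapper`` to copy the common block and transaction context fields (chain ID,
+block height, block hash, and so on) from the ``LogEventContext`` onto the index entry before it is saved. This is
+usually backed by an object-mapping profile; the sketch below assumes AutoMapper is used as the mapping provider
+(the profile class name is illustrative) and that the profile is registered in the module created in step 11.
+
+.. code:: c#
+
+    using AutoMapper;
+    using AElfIndexer.Client.Handlers;
+    using Sample.Indexer.Entities;
+
+    namespace Sample.Indexer
+    {
+        // Illustrative AutoMapper profile: lets IObjectMapper map LogEventContext onto the index entry.
+        public class SampleIndexerMapperProfile : Profile
+        {
+            public SampleIndexerMapperProfile()
+            {
+                CreateMap<LogEventContext, SampleIndexEntry>();
+            }
+        }
+    }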
+11. Register Processors and other indexer plugin services
+
+This module inherits from the AElfIndexer plugin base class; it configures and registers the plugin's services.
+Create the file ``IndexerModule.cs`` under the ``src`` folder. Here is the sample code of it:
+
+.. code:: c#
+
+    using AElfIndexer.Client;
+    using AElfIndexer.Client.Handlers;
+    using AElfIndexer.Grains.State.Client;
+    using Microsoft.Extensions.DependencyInjection;
+    using Sample.Indexer.GraphQL;
+    using Sample.Indexer.Handlers;
+    using Sample.Indexer.Processors;
+    using Volo.Abp.Modularity;
+
+    namespace Sample.Indexer
+    {
+        [DependsOn(typeof(AElfIndexerClientModule))]
+        public class SampleIndexerModule : AElfIndexerClientPluginBaseModule<SampleIndexerModule, IndexerSchema, Query>
+        {
+            protected override void ConfigureServices(IServiceCollection serviceCollection)
+            {
+                var configuration = serviceCollection.GetConfiguration();
+                serviceCollection.AddSingleton<IAElfLogEventProcessor<TransactionInfo>, SampleTransactionProcessor>();
+                // register the block data handler defined under src/Handler (type names assumed)
+                serviceCollection.AddTransient<IBlockChainDataHandler, SampleHandler>();
+                // register your own processors and services here
+                Configure<ContractInfoOptions>(configuration.GetSection("ContractInfo"));
+            }
+
+            // Fill in the client ID and the subscription version returned in Step 1.
+            protected override string ClientId => "";
+            protected override string Version => "";
+        }
+    }
+
+
+Step 3. Testing
+------------------------------------
+
+Testing the Indexer plugin locally can be complex, as it requires simulating the entire Indexer application. It is
+recommended to pull the ``test`` directory directly from this repository,
+``https://github.com/xibo1/aelf-indexer-demo/tree/dev``, for a more straightforward testing environment.
+
+Then, add the necessary test cases in the ``Sample.Indexers.Tests`` folder. The basic idea of writing test cases is
+to simulate the input data of the processors and then check whether the expected data can be queried from Elasticsearch.
+Here is the sample code of a unit test case:
+
+.. code:: c#
+
+    [Fact]
+    public async Task HandleSampleEvent_Test()
+    {
+        const string chainId = "AELF";
+        const string blockHash = "3c7c267341e9f097b0886c8a1661bef73d6bb4c30464ad73be714fdf22b09bdd";
+        const string previousBlockHash = "9a6ef475e4c4b6f15c37559033bcfdbed34ca666c67b2ae6be22751a3ae171de";
+        const string transactionId = "c09b8c142dd5e07acbc1028e5f59adca5b5be93a0680eb3609b773044a852c43";
+        const long blockHeight = 200;
+        var blockStateSetAdded = new BlockStateSet<LogEventInfo>
+        {
+            BlockHash = blockHash,
+            BlockHeight = blockHeight,
+            Confirmed = true,
+            PreviousBlockHash = previousBlockHash
+        };
+
+        var blockStateSetTransaction = new BlockStateSet<TransactionInfo>
+        {
+            BlockHash = blockHash,
+            BlockHeight = blockHeight,
+            Confirmed = true,
+            PreviousBlockHash = previousBlockHash
+        };
+        var blockStateSetKey = await InitializeBlockStateSetAsync(blockStateSetAdded, chainId);
+        var blockStateSetKeyTransaction = await InitializeBlockStateSetAsync(blockStateSetTransaction, chainId);
+        var sampleEvent = new SampleEvent
+        {
+            PlayerAddress = Address.FromPublicKey("AAA".HexToByteArray()).ToString()?.Trim('\"'),
+            Timestamp = 1702968980,
+            Amount = 100000000
+        };
+        var logEventInfo = new LogEventInfo
+        {
+            ExtraProperties = new Dictionary<string, string>
+            {
+                { "Indexed", sampleEvent.ToLogEvent().Indexed.ToString() ?? string.Empty },
+                { "NonIndexed", sampleEvent.ToLogEvent().NonIndexed.ToBase64() }
+            },
+            BlockHeight = blockHeight,
+            ChainId = chainId,
+            BlockHash = blockHash,
+            TransactionId = transactionId
+        };
+        var logEventContext = new LogEventContext
+        {
+            ChainId = chainId,
+            BlockHeight = blockHeight,
+            BlockHash = blockHash,
+            PreviousBlockHash = previousBlockHash,
+            TransactionId = transactionId,
+            Params = "{ \"to\": \"ca\", \"symbol\": \"ELF\", \"amount\": \"100000000000\" }",
+            To = "CAAddress",
+            MethodName = "Played",
+            ExtraProperties = new Dictionary<string, string>
+            {
+                { "TransactionFee", "{\"ELF\":\"30000000\"}" },
+                { "ResourceFee", "{\"ELF\":\"30000000\"}" }
+            },
+            BlockTime = DateTime.UtcNow
+        };
+        var sampleProcessor = GetRequiredService<SampleTransactionProcessor>();
+        await sampleProcessor.HandleEventAsync(logEventInfo, logEventContext);
+        sampleProcessor.GetContractAddress(chainId);
+
+        // save the block state sets into Elasticsearch
+        await BlockStateSetSaveDataAsync(blockStateSetKey);
+        await BlockStateSetSaveDataAsync(blockStateSetKeyTransaction);
+        await Task.Delay(2000);
+
+        var sampleIndexData = await _sampleIndexRepository.GetAsync(Address.FromPublicKey("AAA".HexToByteArray()).ToString()?.Trim('\"'));
+        sampleIndexData.ShouldNotBeNull();
+        sampleIndexData.Amount.ShouldBe(100000000);
+    }
+
+Before running the test cases, Elasticsearch is also needed. The latest version can be downloaded from
+``https://www.elastic.co/downloads/elasticsearch``
+
+Step 4. Deployment of Indexer
+------------------------------------
+
+Compile the developed indexer project and obtain the compiled DLL file. Hand the compiled ``Sample.dll`` file over to the
+administrator of the AeFinder system. The administrator will place the ``Sample.dll`` file into the ``plugins`` folder
+within the DApp module of the AeFinder system.
+
+.. code:: bash
+
+    ubuntu@protkey-did-test-indexer-a-01:/opt/aelf-indexer/dapp-bingo/plugins$ ls
+    BingoGame.Indexer.CA.dll
+
+Subsequently, the AeFinder system will automatically start pushing blocks to the interface plugin
+for processing, following the previously created subscription, and will expose the corresponding GraphQL interface
+externally. The GraphQL interface address will be ``http://URL:{port}/AElfIndexer_DApp/SampleSchema/graphql``.
+The GraphQL playground at this address can be used to check whether the indexer works properly, e.g. the playground
+for the BingoGame indexer:
+
+.. image:: playground.png
+   :alt: Playground
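+For illustration, a DApp backend can call this endpoint with any HTTP or GraphQL client. The sketch below assumes the
+sample schema developed above: the field name ``sampleIndexerQuery`` is derived from the ``SampleIndexerQuery`` method
+and ``playerAddress`` from ``QueryDto``; adjust the URL, field, and argument names to your actual deployment.
+
+.. code:: c#
+
+    using System.Text;
+    using System.Text.Json;
+
+    // Minimal sketch: query the indexer's GraphQL endpoint over HTTP.
+    var http = new HttpClient();
+    var request = new
+    {
+        query = "query { sampleIndexerQuery(dto: { playerAddress: \"player-address-here\" }) { data { fromAddress timestamp amount } } }"
+    };
+    var content = new StringContent(JsonSerializer.Serialize(request), Encoding.UTF8, "application/json");
+    // Replace URL and port with the address exposed by the AeFinder system.
+    var response = await http.PostAsync("http://URL:port/AElfIndexer_DApp/SampleSchema/graphql", content);
+    Console.WriteLine(await response.Content.ReadAsStringAsync());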
+Conclusion
+------------------------------------
+
+By following these steps, DApps can seamlessly integrate with AeFinder, enabling efficient retrieval and processing
+of on-chain data. This guide introduces the indexer end to end and is intended to make the development process smooth.
\ No newline at end of file
diff --git a/docs-sphinx/tutorials/indexer/index.rst b/docs-sphinx/tutorials/indexer/index.rst
new file mode 100644
index 0000000000..ff73922a3a
--- /dev/null
+++ b/docs-sphinx/tutorials/indexer/index.rst
@@ -0,0 +1,8 @@
+Building an AeFinder plugin
+===========================
+
+
+.. toctree::
+
+   AeFinder Introduction <AeFinder-introduction>
+   Building an AeFinder Plugin <build-an-AeFinder-plugin>
\ No newline at end of file
diff --git a/docs-sphinx/tutorials/indexer/indexer-overall.png b/docs-sphinx/tutorials/indexer/indexer-overall.png
new file mode 100644
index 0000000000..bda2f2a05a
Binary files /dev/null and b/docs-sphinx/tutorials/indexer/indexer-overall.png differ
diff --git a/docs-sphinx/tutorials/indexer/indexer-plugin.png b/docs-sphinx/tutorials/indexer/indexer-plugin.png
new file mode 100644
index 0000000000..dca81dce01
Binary files /dev/null and b/docs-sphinx/tutorials/indexer/indexer-plugin.png differ
diff --git a/docs-sphinx/tutorials/indexer/playground.png b/docs-sphinx/tutorials/indexer/playground.png
new file mode 100644
index 0000000000..cf867bd24c
Binary files /dev/null and b/docs-sphinx/tutorials/indexer/playground.png differ
diff --git a/docs-sphinx/tutorials/indexer/subscription_version.jpeg b/docs-sphinx/tutorials/indexer/subscription_version.jpeg
new file mode 100644
index 0000000000..30693bbc13
Binary files /dev/null and b/docs-sphinx/tutorials/indexer/subscription_version.jpeg differ