Dev 1.5.8 as Next Stable Release #539
Conversation
Reverse sync back into dev
Bumps the all-actions group with 1 update: [softprops/action-gh-release](https://github.com/softprops/action-gh-release). Updates `softprops/action-gh-release` from 2.4.2 to 2.5.0 - [Release notes](https://github.com/softprops/action-gh-release/releases) - [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md) - [Commits](softprops/action-gh-release@v2.4.2...v2.5.0) --- updated-dependencies: - dependency-name: softprops/action-gh-release dependency-version: 2.5.0 dependency-type: direct:production update-type: version-update:semver-minor dependency-group: all-actions ... Signed-off-by: dependabot[bot] <support@github.com>
…/dev/all-actions-20dc1db0c3 Bump softprops/action-gh-release from 2.4.2 to 2.5.0 in the all-actions group
Bumps the all-actions group with 1 update: [actions/checkout](https://github.com/actions/checkout). Updates `actions/checkout` from 6.0.0 to 6.0.1 - [Release notes](https://github.com/actions/checkout/releases) - [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md) - [Commits](actions/checkout@v6.0.0...v6.0.1) --- updated-dependencies: - dependency-name: actions/checkout dependency-version: 6.0.1 dependency-type: direct:production update-type: version-update:semver-patch dependency-group: all-actions ... Signed-off-by: dependabot[bot] <support@github.com>
…/dev/all-actions-2507bcfa80 Bump actions/checkout from 6.0.0 to 6.0.1 in the all-actions group
1. Bump Minimum Supported Versions 2. Forcefully Trigger Webs_update on the Nodes 3. Minor Cleanup/Tuning
Parallel Node Processing. This should reduce the node-checking phase to a single 8-second wait.
Update Dev Tag
1. Additional Cleanup of Unused code 2. Don't logout until the final phase.
Update Dev Tag
Add Back Missing Comment
Ready for your review when you're free and available :) Thanks again! Feel free to submit anything to dev if you have any outstanding changes you'd like included, and happy new year!
Sorry, bud. I had no time today to fully catch up with my secondary emails and GitHub messages. My wife and I spent the day running several errands, buying some food, groceries, etc., and taking care of other personal/family stuff. Right now we're getting ready to leave home for my parents' house, where the family will get together to celebrate New Year's Eve. I'll take a look and review the PR changes tomorrow. I did briefly read your synopsis of the changes made, and the "parallel background node processing" looks like a great idea to optimize that part of the code, especially for users having more than 1 or 2 nodes. Also, triggering the "webs_update.sh" script on each node before checking for any F/W update is an excellent idea. I'm going offline now. Talk to you tomorrow. Enjoy your New Year's Eve celebration!!
I tested the change with my single node, but of course it's only one node, so for me it makes no difference. However, I also had @TheS1R test with success, and he has 3 Merlin nodes. All 3 were triggered and checked within the 8-second time frame as expected. I think it's the best way to solve the issue. Initially I thought about bypassing the requirements for webs_update on the node, or "detaching" that requirement. But the real solution is just to find a way to trigger the built-in script remotely, which is what I found with curl. Then MerlinAU is always accurate and has an "up to date view" when it runs on the primary. Thanks again for the nice feedback! No rush, I just thought I'd shoot my shot for today haha 😂 Enjoy the family time and Happy New Year!! Chat in the new year 🙂
Miscellaneous code improvements and fine-tuning.
Yeah, that was a very nice solution so that the main/primary router gets the latest F/W update status for all mesh nodes when needed. Yesterday evening, I reviewed the PR changes, and overall, they look good. Then I also made some code improvements and fine-tuning, but I was too tired by the time I finished late last night to perform testing and validation. So I did that this morning and also ran the latest script through the Linter tool. No errors found and no new warnings - we got "clean" code. 😉
I'll be submitting a PR shortly with my latest modifications. Happy New Year, bud!!! 🎄🥳
Looking good!!
I made some code improvements and fine-tuning (PR #540), so when everything has been reviewed and validated, and things are OK, we can merge the 2 PRs together for the next release.
I think the user will be happy we found a solution, I know I am! It was always something I was aware of, but I was okay living with it until someone else noticed the dependency and called it out ;) Hahaha
I love myself some clean code, scrub that code clean @Martinski4GitHub !!!!
Fantastic! I'm moving over there now to do a code review and some testing!
@ExtremeFiretop , let me know once it's merged into dev, and I can re-verify as well.
Just in the middle of my code review now, shouldn't be much longer. |
Code Improvements
Please feel free to test again with your 3 nodes. Please make sure to validate that the log file for the built-in update check is triggered on the nodes, as I showed you via email :)
@ExtremeFiretop Tested and validated log file timestamps for all three (3) AiMesh nodes.
Sounds like we are ready to make it fly!
Fly using @tech9's airplane? 🤣
Martinski4GitHub
left a comment
Ready and good to go!!!
Thank you for helping us test and validate the changes in the PRs. Your collaboration is much appreciated.
@TheS1R has a very "busy" environment with lots of devices set up to play with; he is a great person to test for this reason. I hope he knows how valuable that is!
I'm more than happy to assist as needed.
Forums post update = COMPLETED
Readme update = COMPLETED
Version.txt update = COMPLETED
Dev 1.5.8 as Next Stable Release
What's Changed/Fixed?:
COMMIT: [ #bae3640 ] - Pre-emptively Trigger webs_update on Nodes
This should solve an "issue" reported by Untried3868 here: https://www.snbforums.com/threads/merlinau-v1-5-7-the-ultimate-firmware-auto-updater-webui-gnuton-support.91326/post-976770 where the primary router did not detect any updates for the nodes until each node was self-aware of an update.
Now we force-trigger each node to run the built-in webs_update script so it becomes self-aware before we query it for available updates.
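As a rough illustration of the remote-trigger idea, here is a minimal sketch of kicking off a node's built-in update check via curl. The endpoint, parameter names, and the `start_webs_update` service name are assumptions for illustration only, not MerlinAU's actual implementation:

```shell
#!/bin/sh
# Hypothetical sketch: remotely trigger a node's built-in webs_update check.
# The "applyapp.cgi" endpoint and "start_webs_update" service name are
# assumptions, NOT MerlinAU's actual implementation details.
CURL_CMD="${CURL_CMD:-curl -s}"   # overridable so the sketch can be dry-run

TriggerWebsUpdate()
{
   node_ip="$1"
   # Word splitting of $CURL_CMD is intentional ("curl -s" -> two args).
   $CURL_CMD "http://${node_ip}/applyapp.cgi" \
       --data "action_mode=apply&rc_service=start_webs_update"
}
```

Setting `CURL_CMD=echo` before calling `TriggerWebsUpdate 192.168.1.2` (example IP) prints the request instead of sending it, which is handy for checking the assembled URL.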
The minimum supported firmware versions are now updated to 3 versions behind production. The last update in this regard was about 9 months ago.
COMMIT: [ #5e125e2 ] - Parallel Background Node Processing
This should speed up the checking of nodes to a single 8-second wait, instead of my previous implementation in commit bae3640, which waited 8 seconds PER node.
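The parallel pattern can be sketched as follows, with a sleep-based stand-in for the real per-node work and hypothetical node names; the point is that backgrounded checks overlap their waits, so the total wait is one interval rather than one per node:

```shell
#!/bin/sh
# Sketch of the parallel background pattern. CheckNode is a stand-in for
# the real per-node work (login, trigger webs_update, fetch, parse).
NODES="node1 node2 node3"   # hypothetical AiMesh node identifiers

CheckNode()
{
   sleep 1   # stand-in for the single fixed wait per node
   echo "$1: checked"
}

# Launch every node check in the background so their waits overlap,
# then wait once for all of them instead of once per node.
results=$(
   for node in $NODES
   do CheckNode "$node" &
   done
   wait
)
echo "$results"
```

With three nodes this finishes in roughly one sleep interval instead of three, mirroring the single 8-second wait described above.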
COMMIT: [ #ece930e ] - Additional Cleanup / No Logout
When I initially designed the node functionality, I had it pull additional values from the nodes with the intent of potentially expanding the feature some day. Now I've realized that pulling these values is unnecessary and the feature is as mature as it will ever be.
Previously my implementation in commit 5e125e2 was doing multiple logins and logouts for each task...
(i.e., LOGIN → TRIGGER → LOGOUT → WAIT → LOGIN → FETCH → PARSE → LOGOUT).
Now we do LOGIN → TRIGGER → WAIT → FETCH → PARSE → LOGOUT.
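The consolidated single-session flow above can be sketched with stub steps; each echo stands in for the real router login / HTTP work, and the function names are illustrative, not the script's actual identifiers:

```shell
#!/bin/sh
# Stubbed sketch of the single-session node-check flow; each echo stands
# in for a real router login / HTTP request performed by the script.
DoLogin()  { echo "LOGIN"; }     # one login for the whole sequence
DoTrigger(){ echo "TRIGGER"; }   # kick off webs_update on the node
DoWait()   { echo "WAIT"; }      # single fixed wait for the node check
DoFetch()  { echo "FETCH"; }     # read the node's F/W update status
DoParse()  { echo "PARSE"; }
DoLogout() { echo "LOGOUT"; }    # one logout at the very end

NodeCheckFlow()
{
   DoLogin; DoTrigger; DoWait; DoFetch; DoParse; DoLogout
}

flow=$(NodeCheckFlow | xargs)    # join the steps onto one line
echo "$flow"
```

Keeping the session open across TRIGGER, FETCH, and PARSE removes one full login/logout round trip per node compared with the earlier flow.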
PR: [ #540 ] - Code Improvements
As always, we highly recommend updating ASAP, as this release includes functional improvements and minor bug fixes.
Thanks!