Building a quad as a platform for research

Started by ayu135, March 04, 2014, 04:45:15 PM

Swapnil

Hmmm... It's just that the BLDC might get even more damaged while you are trying to fix it.

Have you tried searching for it at the local stores?

rchobbyaddict

With a strong technical base, we develop high-performance, reliable, and easy-to-use unmanned aerial vehicles (UAVs) for commercial and recreational use.
For more info, visit my page: https://www.facebook.com/funaster

ayu135

I managed to get the bearing out, but had to damage the motor a little: I made a small cut in the aluminium behind the bearing, which gave me enough leverage to push the casing out. I installed new bearings and checked the motor for any performance issues; so far nothing has come up.

ayu135

The quad is finally complete and is surprisingly stable. I am now going to fit the APM with GPS in place of the KK board.

In other news, I have been offered a research internship at the Robotics Research Center at IIIT-Hyderabad for this summer, so I will be continuing my research there. Hopefully I will be able to make substantial progress over the summer, particularly on vision-based navigation.

chawlap

Please post if any further progress has been made. I am building an object-following quad using an APM 2.6 as a research project at IIT Kanpur, so the core part of my project is the same as yours. The only difference is that I am planning to do the video processing off-board. I need to control the quad using C/C++ code. Any advice/comments would be appreciated :)

ayu135

Updates:
The initial project I was planning has been pushed back a bit. Here at IIIT-H I am working on a rudimentary ground station using MAVLink, through which I will try to control the quad from a laptop. This will then be extended to fly the quad using hand gestures via a Kinect.

So far I have written a basic console application that can read packets from and send packets to the ArduPilot Mega over serial, via both USB and the 433 MHz telemetry link. The app is in C# right now and uses a lot of code from Mission Planner. I plan to move to C++ as that is more familiar ground for me, but understanding MAVLink is going to take a while.

@chawlap A lot of what you want to do will depend on your setup. If your quad is based on an ArduPilot Mega or a Pixhawk, I would suggest you have a look at the MAVLink protocol for the communication. Mission Planner is written in C# and is open source, and so is MAVLink; have a look at their code and try to figure out how to communicate with the quad. I can tell you what to do specifically for ArduPilot as that is what I am working with.
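To give a rough idea of the receive side, an untested C++ sketch using the generated MAVLink C headers might look something like this. Opening and configuring the serial port is left out, and the include path and the ids are placeholders, so treat it only as a starting point:

#include <cstdio>
#include <cstdint>
#include <unistd.h>      // read()
#include "mavlink.h"     // generated MAVLink headers

// Feed bytes from an already-opened serial port into the MAVLink parser and
// print a couple of the messages the APM streams out.
void read_loop(int serial_fd)
{
    mavlink_message_t msg;
    mavlink_status_t  status;
    uint8_t byte;

    while (read(serial_fd, &byte, 1) == 1) {
        // Returns non-zero once a complete, checksum-valid packet has been assembled.
        if (mavlink_parse_char(MAVLINK_COMM_0, byte, &msg, &status)) {
            switch (msg.msgid) {
            case MAVLINK_MSG_ID_HEARTBEAT: {
                mavlink_heartbeat_t hb;
                mavlink_msg_heartbeat_decode(&msg, &hb);
                printf("heartbeat from sys %u, mode %u\n",
                       (unsigned)msg.sysid, (unsigned)hb.custom_mode);
                break;
            }
            case MAVLINK_MSG_ID_ATTITUDE: {
                mavlink_attitude_t att;
                mavlink_msg_attitude_decode(&msg, &att);
                printf("roll %.2f pitch %.2f yaw %.2f (rad)\n",
                       att.roll, att.pitch, att.yaw);
                break;
            }
            }
        }
    }
}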

chawlap

@ayu135 Thank you for your reply! :)

I did go through how MAVLink and the ArduPilot Mega work (I'd also be using ArduPilot ;) ) and what I inferred was that Follow Me mode would be the most suitable for object following, since I can simply give the target coordinates of the point where I want my copter to go. Along with this, though, I would also need to control the yaw of my copter when I want it to change its orientation while holding the same position. So, do you know of any method by which I can directly control the yaw of my copter? Specifically, what commands would I need to send using MAVLink?

Any other comments/suggestions are also appreciated. :)

ayu135

Well, you could look at the RC_CHANNELS_OVERRIDE message: what it does is override the RC input. It takes a value between 1000 and 2000 for each channel, and a value of 0 for a particular channel means you don't wish to override it. You can have a look at the MAVLink documentation for a complete list of all the commands available and how they are structured. You would need to write a program that sends the packets over serial for you.
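For example, overriding only the yaw channel (channel 4 on a standard ArduCopter setup) with the generated C headers would look roughly like this. This is an untested sketch; the system/component ids and the serial write are just placeholders for whatever your program already uses:

#include <cstdint>
#include <unistd.h>      // write()
#include "mavlink.h"     // generated MAVLink headers

// Pack an RC_CHANNELS_OVERRIDE message and push it out the serial link.
void override_yaw(int serial_fd, uint16_t yaw_pwm)
{
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    // Channels set to 0 mean "do not override this channel";
    // channel 4 (yaw) gets a 1000-2000 us value.
    mavlink_msg_rc_channels_override_pack(
        255, 0,            // our (GCS) system id / component id
        &msg,
        1, 0,              // target system / component (the APM)
        0, 0, 0, yaw_pwm,  // ch1 roll, ch2 pitch, ch3 throttle, ch4 yaw
        0, 0, 0, 0);       // ch5-ch8

    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(serial_fd, buf, len);
}

As far as I understand, sending the same message again with all channels set to 0 releases the override back to the transmitter.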

You can have a look at the MAVLink repository on GitHub ( https://github.com/mavlink/mavlink ): clone it or download the zip and you will find mavgenerate.py in it. What mavgenerate does is take an XML definition file from the message_definitions folder and generate the appropriate headers in C, Java, Python, C#, etc. (you can choose the language). These headers contain all the enums and parse functions for all the messages, along with methods for encoding, decoding, and sending messages over serial.
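As a small example of the encode side, packing and sending the ground station's own heartbeat (which the autopilot expects roughly once a second so it knows the link is alive) would look something like this with the generated C headers. Again, an untested sketch with placeholder ids and an already-opened serial port:

#include <cstdint>
#include <unistd.h>      // write()
#include "mavlink.h"     // generated MAVLink headers

// Send one GCS heartbeat; call this about once per second.
void send_gcs_heartbeat(int serial_fd)
{
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    mavlink_msg_heartbeat_pack(
        255, 0,                 // our system / component id (255 is the usual GCS id)
        &msg,
        MAV_TYPE_GCS,           // we are a ground control station
        MAV_AUTOPILOT_INVALID,  // we are not an autopilot
        0, 0,                   // base_mode, custom_mode (unused for a GCS)
        MAV_STATE_ACTIVE);

    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(serial_fd, buf, len);
}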

Follow Me mode needs a USB GPS module on the ground-station side, and GPS doesn't work indoors; you would also have to mount a laptop with that dongle attached on the object being followed. When you said object following I thought it was through vision. Is your project about following an object outdoors with GPS reception?

chawlap

Nope! You got me right the first time; I am working on vision-based object following only.

What I was thinking was this: we can get the GPS coordinates and the direction in which our quad is facing via MAVLink, right? We'd also be getting live video from the quad through some wireless link. Now, if our target object is drifting towards the left part of the video frame (i.e. we want our quad to turn left while going forward), we can send 'fake' GPS coordinates (fake in the sense that we have not got these from any GPS device) which are ahead and to the left with respect to the quad. The quad will then try to go to this 'fake' GPS coordinate, and in the process it will turn left while going forward.
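Roughly, the arithmetic I have in mind for building that 'fake' coordinate is something like the following (a flat-earth approximation, which should be fine over a few metres; completely untested, and the names are just mine):

#include <cmath>

struct LatLon { double lat, lon; };   // degrees

// Turn "go forward_m metres ahead and left_m metres to the left of the quad"
// into a lat/lon target, given the quad's current position and heading.
// heading is in radians, measured clockwise from north.
LatLon offset_target(double lat, double lon, double heading,
                     double forward_m, double left_m)
{
    const double pi = 3.14159265358979;
    const double earth_radius = 6371000.0;          // metres

    // Rotate the body-frame offset into north/east components.
    double north = forward_m * std::cos(heading) + left_m * std::sin(heading);
    double east  = forward_m * std::sin(heading) - left_m * std::cos(heading);

    // Metres to degrees; a degree of longitude shrinks with cos(latitude).
    double dlat = (north / earth_radius) * 180.0 / pi;
    double dlon = (east  / (earth_radius * std::cos(lat * pi / 180.0))) * 180.0 / pi;

    return { lat + dlat, lon + dlon };
}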

I am sorry if I am getting anything trivial wrong; I am a complete newbie and never got a chance to deal with this stuff before. But does this sound possible to you? I would be happy to work on any other option if it seems better to you. :)

Swapnil

@chawlap:
All you need to do is use a good object-tracking algorithm and send the appropriate control signals (in this case 'yaw signals'). You don't have to mess with the GPS system.
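If the copter is running ArduPilot in Guided mode, I believe the cleaner way to send that yaw signal is a MAV_CMD_CONDITION_YAW inside a COMMAND_LONG message rather than touching the GPS at all. A rough, untested C++ sketch with the generated headers; the ids and the serial write are placeholders, and it is worth checking that your firmware version actually handles this command:

#include <cstdint>
#include <unistd.h>      // write()
#include "mavlink.h"     // generated MAVLink headers

// Ask the copter to yaw by target_deg at rate_deg_s in the given direction.
void command_yaw(int serial_fd, float target_deg, float rate_deg_s, bool clockwise)
{
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    mavlink_msg_command_long_pack(
        255, 0,                    // our (GCS) system / component id
        &msg,
        1, 0,                      // target system / component
        MAV_CMD_CONDITION_YAW,
        0,                         // confirmation
        target_deg,                // param1: target angle in degrees
        rate_deg_s,                // param2: yaw speed in deg/s
        clockwise ? 1.0f : -1.0f,  // param3: direction
        1.0f,                      // param4: 1 = relative to current heading
        0, 0, 0);                  // params 5-7 unused

    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(serial_fd, buf, len);
}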

ayu135

Another round of updates:
The work with the quad is progressing, and I am really enjoying the work here. I am getting to learn a lot about quads in depth.

1) The ground station is now a proper GUI-based Windows application. We can control the roll, pitch and yaw of the quad using the WASD keys, change the ArduPilot's flight mode, and arm/disarm. It can read all the IMU data from the ArduPilot and display it, and it can send the set_roll_pitch_yaw_throttle command to the quad for more precise autonomous control instead of the RC override commands.

2) On the ArduPilot side, after fiddling around with the ArduCopter code a bit, we found that the ATmega processor is already pretty taxed by the existing code, and adding any more functionality was not possible. We are now planning to write our own controller from scratch, reusing the ArduPilot libraries as much as we can. We may move over to a PX4FMU+IO or a Pixhawk, but that depends on whether we can accomplish this on the ArduPilot with the from-scratch code.

3) After trying to get position estimates using the gyro and the accelerometer, we realised that the errors are too large to be reliable after about 300 ms of integration. We are now trying to get the PX4FLOW sensor to work with the APM. The PX4FLOW is a nifty little sensor that might solve our problems to a large extent.

4) Gesture control using the Kinect was implemented, though not optimally: the quad was too unstable to be controlled via gestures alone. The gesture recognition itself, on the other hand, was impeccable. Once we get the quad to hover stably indoors using the PX4FLOW, it should be much easier to control.

5) After we get the quad to hover stably in one place indoors without drifting, our next task is to have it follow simple coordinates sent via MAVLink and navigate to them autonomously, and then have it trace basic figures like a square by sending the coordinates of the corner points (see the sketch after this list).
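For that navigation step, my current understanding is that a single "go to this point" target can be sent to ArduCopter in Guided mode as a MISSION_ITEM with the current field set to 2 (that appears to be how Mission Planner's fly-to-here works). A rough, untested sketch with the generated C headers; the ids and the serial fd are placeholders, and this is worth double-checking against your firmware:

#include <cstdint>
#include <unistd.h>      // write()
#include "mavlink.h"     // generated MAVLink headers

// Send one guided-mode "goto" target (lat/lon in degrees, altitude relative to home).
void goto_position(int serial_fd, float lat_deg, float lon_deg, float alt_rel_m)
{
    mavlink_message_t msg;
    uint8_t buf[MAVLINK_MAX_PACKET_LEN];

    mavlink_msg_mission_item_pack(
        255, 0,                           // our (GCS) system / component id
        &msg,
        1, 0,                             // target system / component
        0,                                // seq (ignored for a guided target)
        MAV_FRAME_GLOBAL_RELATIVE_ALT,    // altitude relative to home
        MAV_CMD_NAV_WAYPOINT,
        2,                                // current = 2 flags a guided-mode "goto"
        0,                                // autocontinue
        0, 0, 0, 0,                       // params 1-4 unused here
        lat_deg, lon_deg, alt_rel_m);     // x, y, z

    uint16_t len = mavlink_msg_to_send_buffer(buf, &msg);
    write(serial_fd, buf, len);
}

Sending four such targets one after another, waiting for the quad to reach each, would essentially give the square in point 5.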

Gotta get back to work now! And I am really sorry that I cannot post pictures or videos of the quad right now. I will surely try to upload some once we are done with this project.