@AhmadLight: LectureVideos relies entirely on the Android MediaCodec API and contains neither codecs nor any other encoding, decoding, multiplexing, or demultiplexing code (mainly to avoid legal problems). The MediaCodec APIs of Android 4.1 and 4.2 do not provide a multiplexer to create a container format such as MP4, AVI, or MKV. Therefore, LectureVideos produces two files per session: an H.264 file containing the encoded video signal and a 3GP container file containing the encoded audio signal.
On devices running Android 4.1 and 4.2, you need an external tool such as FFmpeg to multiplex the video and audio signals into a container file. Let's say you have video1.h264 (the video stream) and video1.3gp (the audio stream); you can then produce an MP4 file using the command
ffmpeg -r 12 -i video1.h264 -i video1.3gp -b 1.5M video1.mp4
where `-r 12` sets the frame rate (12 frames per second) and `-b 1.5M` sets the output bit rate to 1.5 Mbit per second (recent FFmpeg versions prefer the explicit form `-b:v 1.5M`). The resulting MP4 container file video1.mp4 can be viewed with essentially all video players and uploaded to YouTube and other video platforms. There are graphical user interfaces for FFmpeg available for essentially all operating systems, so you do not need to worry about command line options.
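For completeness, here is the command again as a sketch, together with a stream-copy variant; the file names follow the example above, and the `-c copy` form assumes you want to keep the encoded streams unchanged (which avoids any quality loss from re-encoding):

```shell
# Mux the H.264 video stream and the 3GP audio stream into an MP4
# container, re-encoding the video at 1.5 Mbit/s as described above:
ffmpeg -r 12 -i video1.h264 -i video1.3gp -b:v 1.5M video1.mp4

# Alternative: copy both streams unchanged into the container
# (faster and lossless, since no re-encoding takes place):
ffmpeg -r 12 -i video1.h264 -i video1.3gp -c copy video1_copy.mp4
```

In both cases the `-r 12` must precede the first `-i`, since a raw H.264 stream carries no timing information of its own.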
There are FFmpeg ports for Android available in Google Play, but I would rather recommend performing the multiplexing on a `normal` computer.
There are alternatives to FFmpeg, such as mkvmerge (which produces an MKV container file); one LectureNotes user published a short video on that approach,
http://www.youtube.com/watch?v=kTTc-2EtPAw.
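For reference, a minimal sketch of the corresponding mkvmerge call, assuming the same file names as in the FFmpeg example above:

```shell
# Mux both streams into an MKV container; mkvmerge always copies
# the streams as they are, it never re-encodes them:
mkvmerge -o video1.mkv video1.h264 video1.3gp
```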
On devices running Android 4.3 and 4.4, you can still use an external tool to multiplex the video and audio signals into a container format, which can be advisable if you have specific optimization needs, for instance for streaming. Furthermore, you can share the video as an MP4 container file directly in LectureNotes: LectureVideos uses the Android MediaMuxer API (introduced in Android 4.3) to multiplex the video and audio signals into the MP4 container format prior to sharing.
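To illustrate what happens under the hood on Android 4.3 and later, here is a simplified sketch of the standard MediaMuxer pattern, draining an encoder's output buffers into the muxer. This is not LectureVideos' actual code; names such as `drain` and `TIMEOUT_US` are illustrative, and error handling is omitted:

```java
import android.media.MediaCodec;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

// Hypothetical sketch of the Android 4.3+ muxing pattern:
// encoded samples are handed from a running MediaCodec encoder
// to a MediaMuxer, which writes the MP4 container.
class MuxerSketch {
    private static final long TIMEOUT_US = 10000; // illustrative

    static void drain(MediaCodec encoder, MediaMuxer muxer) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        ByteBuffer[] outputs = encoder.getOutputBuffers(); // pre-API-21 style
        int track = -1;
        while (true) {
            int index = encoder.dequeueOutputBuffer(info, TIMEOUT_US);
            if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer learns the track format from the encoder;
                // start() may only be called once all tracks are added.
                track = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
            } else if (index == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputs = encoder.getOutputBuffers();
            } else if (index >= 0) {
                if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // Codec config data is already part of the track format,
                    // so it must not be written as a regular sample.
                    info.size = 0;
                }
                if (info.size > 0 && track >= 0) {
                    muxer.writeSampleData(track, outputs[index], info);
                }
                encoder.releaseOutputBuffer(index, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            }
        }
    }
}
```

In a real application this loop runs once per track (video and audio), and `muxer.start()` is deferred until both tracks have been added, followed by `muxer.stop()` and `muxer.release()` at the end.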